Human Generated Data

Title

Untitled (Indian temple, Singapore)

Date

February 17, 1960 – February 20, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2369

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Crypt 99.5
Arch 98.6
Architecture 98.6
Person 98.5
Person 98.5
Adult 98.5
Bride 98.5
Female 98.5
Wedding 98.5
Woman 98.5
Person 97.8
Person 97.4
Building 95.5
Monastery 95.5
Floor 95.5
Corridor 85.4
Indoors 85.4
Animal 85.1
Bird 85.1
Flooring 84.6
Art 83.8
Painting 83.8
Pillar 80.9
Machine 79.1
Wheel 79.1
Face 76.1
Head 76.1
Housing 64.4
House 56.2
Porch 56.2
Cross 56.1
Symbol 56.1
Archaeology 55.3

Clarifai
created on 2018-05-10

people 99.1
room 98.6
door 96.5
street 94.5
no person 94.3
furniture 93.8
adult 93.6
man 93.4
doorway 92.8
monochrome 92
group 92
indoors 90.2
wear 89.8
family 88
window 86.2
woman 84.7
home 83.9
stock 81.9
one 81.9
group together 81.6

Imagga
created on 2023-10-07

wardrobe 72.1
furniture 59.8
furnishing 43.6
door 29.9
room 18.8
interior 17.7
bookend 17.6
wall 17.5
support 17
house 15.9
building 15.5
boutique 14.8
entrance 14.5
architecture 13.9
business 12.7
device 12.7
old 12.5
city 12.5
binder 12.3
urban 12.2
home 12
open 11.7
design 11.2
protective covering 11.2
office 10.9
sliding door 10.4
covering 10.4
window 10.3
construction 10.3
nobody 10.1
doors 9.8
inside 9.2
new 8.9
metal 8.8
indoors 8.8
light 8.7
3d 8.5
floor 8.4
frame 8.3
market 8
hall 7.9
render 7.8
glass 7.8
travel 7.7
toilet 7.7
horizontal 7.5
equipment 7.3
success 7.2
black 7.2
art 7.2
to 7.1
modern 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

white 81.4
black 80.3
store 30.2

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 29-39
Gender Female, 59.8%
Sad 80.3%
Fear 59.3%
Surprised 9.1%
Angry 3.7%
Disgusted 3.4%
Calm 2.4%
Confused 1.6%
Happy 0.7%

AWS Rekognition

Age 22-30
Gender Female, 95.7%
Sad 100%
Surprised 6.3%
Fear 6.2%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
Calm 0%
Confused 0%

AWS Rekognition

Age 16-24
Gender Female, 99.6%
Fear 73.4%
Disgusted 33.4%
Surprised 7.8%
Sad 2.7%
Angry 2.5%
Calm 1.5%
Confused 1%
Happy 0.9%

Microsoft Cognitive Services

Age 13
Gender Male

Feature analysis

Amazon

Person 98.5%
Adult 98.5%
Bride 98.5%
Female 98.5%
Woman 98.5%
Bird 85.1%
Wheel 79.1%

Categories

Text analysis

Amazon

de