Human Generated Data

Title

Untitled (Indian temple, Singapore)

Date

February 17, 1960-February 20, 1960

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5370

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 98
Human 98
Person 97
Person 96
Person 95.5
Symbol 93.6
Emblem 93.6
Weaponry 88.3
Weapon 88.3
Statue 80.7
Sculpture 80.7
Art 80.7
People 74
Person 72.2
Painting 71.7
Person 69.9
Spear 68.5
Trident 68.5
Military 67.9
Military Uniform 67.9
Person 65.6
Soldier 60
Person 57.1

Clarifai
created on 2018-03-23

people 99.9
group 99.8
many 99.4
military 98.3
soldier 98.2
war 97.5
adult 97.3
man 96.7
group together 96.3
vehicle 96.2
skirmish 94.8
art 94.8
weapon 94.7
sculpture 92.8
veil 92.5
one 90.1
statue 89.8
leader 89.8
wear 89.7
victory 89.6

Imagga
created on 2018-03-23

fountain 86.5
statue 65.7
sculpture 60.3
structure 56.7
art 27
architecture 26.6
monument 25.2
history 25
carving 21.8
old 19.5
stone 19
landmark 19
ancient 18.2
building 16.7
culture 16.2
city 15.8
religion 15.2
famous 14.9
soldier 14.7
temple 13.7
figure 13.4
sky 13.4
travel 13.4
china 13.2
memorial 12.9
historic 12.8
military 12.6
war 12.5
tourism 12.4
detail 12.1
man 11.9
bronze 11.8
plastic art 11.2
religious 11.2
tourist 11.1
park 10.9
lion 10.7
protection 10
decoration 9.7
antique 9.6
oriental 9.4
army 8.8
symbol 8.8
traditional 8.3
battle 7.8
face 7.8
warrior 7.8
male 7.8
marble 7.7
worship 7.7
heritage 7.7
palace 7.7
god 7.7
attraction 7.6
horse 7.6
east 7.5
style 7.4
tradition 7.4
support 7.4
metal 7.2
black 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

old 47.4
several 13.6
altar 11.3

Face analysis

AWS Rekognition

Age 20-38
Gender Male, 54.8%
Angry 46%
Sad 45.7%
Happy 45.2%
Surprised 45.5%
Confused 46.4%
Disgusted 46.3%
Calm 49.8%

AWS Rekognition

Age 26-43
Gender Male, 54.2%
Happy 45.1%
Disgusted 45.9%
Confused 45.7%
Calm 51.3%
Sad 45.5%
Angry 46.2%
Surprised 45.3%

AWS Rekognition

Age 30-47
Gender Female, 94.6%
Angry 4.5%
Sad 4.2%
Calm 61.8%
Disgusted 13%
Happy 3.3%
Surprised 10.7%
Confused 2.5%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Confused 45%
Surprised 45.1%
Calm 54.6%
Sad 45.1%
Disgusted 45.1%
Happy 45.1%
Angry 45%

AWS Rekognition

Age 17-27
Gender Female, 53.8%
Confused 45.2%
Angry 45.2%
Surprised 46.6%
Calm 45.5%
Happy 45.2%
Disgusted 47.2%
Sad 50.1%

AWS Rekognition

Age 26-44
Gender Female, 55.5%
Surprised 5.2%
Disgusted 3%
Confused 3.3%
Sad 12.8%
Angry 5.9%
Happy 13%
Calm 56.9%

AWS Rekognition

Age 26-43
Gender Male, 54.7%
Angry 45.7%
Surprised 45.5%
Happy 45.1%
Sad 50.6%
Calm 45.9%
Disgusted 46%
Confused 46.2%

Microsoft Cognitive Services

Age 30
Gender Male

Feature analysis

Amazon

Person 98%
Painting 71.7%