Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960-February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2333

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 100
Person 98.7
Person 98.6
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Person 98.5
Person 98.3
Adult 98.3
Bride 98.3
Female 98.3
Wedding 98.3
Woman 98.3
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Person 98
Person 97.8
Person 88
Dancing 85.2
Leisure Activities 85.2
Person 84.6
Face 76.5
Head 76.5
Accessories 61.8
Formal Wear 61.8
Tie 61.8
Tie 61.3
Person 56.7
Glasses 55.5

Clarifai
created on 2018-05-10

people 100
many 99.8
group together 99.8
group 99.6
adult 98.2
military 94.8
wear 94.6
outfit 94.1
man 93.8
crowd 93.5
spectator 93.1
dancing 92
woman 88.3
music 88.1
dancer 87.3
uniform 85.8
veil 84.8
administration 84.3
child 83.2
war 82.9

Imagga
created on 2023-10-06

pedestrian 36.1
innocent 24.3
uniform 23.2
man 21.5
person 19.5
military 19.3
male 19.2
people 18.4
sport 18.2
danger 17.3
child 16.3
clothing 16.3
soldier 15.6
weapon 15.3
gun 14.9
military uniform 14.2
horse 13.5
war 13.5
outdoor 13
protection 12.7
mask 12.6
adult 12.3
two 11.9
seller 11.7
outdoors 11.3
rifle 11.2
men 11.2
industrial 10.9
dirty 10.8
camouflage 10.8
accident 10.7
toxic 10.7
nuclear 10.7
animal 10.5
old 10.4
maypole 10
stalker 9.9
recreation 9.9
radioactive 9.8
radiation 9.8
destruction 9.8
protective 9.7
fun 9.7
chemical 9.7
gas 9.6
world 9.3
vacation 9
battle 8.8
army 8.8
horses 8.8
post 8.7
dirt 8.6
travel 8.4
leisure 8.3
competition 8.2
history 8
game 8
boy 7.8
walking 7.6
field 7.5
traditional 7.5
smoke 7.4
environment 7.4
tradition 7.4
safety 7.4
historic 7.3
playing 7.3
activity 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 100
tree 99
person 98.9
sport 93.4
group 90.2
people 88.9
dancer 82.9
posing 38.9
crowd 26.3

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 99.9%
Calm 96.3%
Surprised 6.5%
Fear 6.1%
Sad 2.3%
Confused 1.4%
Disgusted 0.5%
Angry 0.4%
Happy 0.2%

AWS Rekognition

Age 23-31
Gender Male, 100%
Calm 98.8%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Angry 0.4%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 24-34
Gender Female, 66.8%
Calm 97.6%
Surprised 6.5%
Fear 6.1%
Sad 2.2%
Angry 0.5%
Confused 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 29-39
Gender Male, 90.9%
Surprised 75.5%
Calm 43%
Fear 6.4%
Disgusted 4.1%
Sad 2.8%
Angry 2.8%
Happy 1.4%
Confused 1.2%

AWS Rekognition

Age 6-14
Gender Female, 99.6%
Fear 97.6%
Surprised 6.3%
Sad 2.8%
Angry 0.7%
Disgusted 0.5%
Happy 0.1%
Calm 0.1%
Confused 0%

AWS Rekognition

Age 2-10
Gender Female, 97.2%
Fear 98%
Surprised 6.3%
Sad 2.2%
Angry 0.4%
Calm 0%
Confused 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 16-22
Gender Female, 97.1%
Sad 100%
Surprised 6.3%
Fear 6%
Calm 1.8%
Angry 1.8%
Happy 0.3%
Disgusted 0.3%
Confused 0.2%

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 17
Gender Female

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Feature analysis

Amazon

Person 98.7%
Adult 98.6%
Male 98.6%
Man 98.6%
Bride 98.3%
Female 98.3%
Woman 98.3%
Tie 61.8%
Glasses 55.5%