Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.5177

Machine Generated Data

Tags (label and confidence score, 0–100)

Amazon
created on 2023-10-05

Boy 99.1
Male 99.1
Person 99.1
Teen 99.1
Person 98.3
Person 98.3
Child 98.3
Female 98.3
Girl 98.3
Person 97.4
Person 96.4
Person 95.8
Person 92.9
Female 92.9
Adult 92.9
Bride 92.9
Wedding 92.9
Woman 92.9
Person 91
Person 88
Person 87.9
Machine 79.6
Wheel 79.6
Person 76.7
People 75.5
Person 75.3
Person 73.1
Face 67.8
Head 67.8
Outdoors 62.4
Photography 57.4
Tent 57.3
Stilts 57.3
Clothing 56.4
Hat 56.4
Costume 55.6
Bag 55.5
Camping 55.4
Electrical Device 55.3
Microphone 55.3
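
The Amazon tags above are the kind of label-and-confidence output returned by AWS Rekognition's DetectLabels operation. A minimal Python sketch of how such a list could be produced is shown below; the local file name, MaxLabels, and MinConfidence values are illustrative assumptions and are not part of this record.

    # Sketch: label detection with AWS Rekognition via boto3.
    # The file name and thresholds are hypothetical, chosen only to
    # mirror the lowest confidence scores listed in this section.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_horse_dance_java.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55.0,
    )

    # Print "Label Confidence" pairs in the same form as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')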

Clarifai
created on 2018-05-10

people 100
many 99.5
group 99.4
group together 99.3
adult 96.2
military 95.7
man 95.7
administration 94.5
child 93.5
war 92.6
leader 91.8
several 91.7
crowd 89.9
wear 89.1
soldier 87.7
veil 85.4
spectator 83.4
woman 81
outfit 77.7
uniform 71.5

Imagga
created on 2023-10-05

engineer 27.1
pedestrian 24.9
uniform 24
man 20.8
travel 19
people 19
clothing 18.3
person 18
danger 17.3
military 16.4
sport 16.1
private 16
male 15.6
military uniform 15.1
tourist 15.1
protection 14.5
innocent 13.7
mask 13.4
weapon 13
soldier 12.7
vacation 12.3
sand 12.2
outdoors 12.1
two 11.9
beach 11.8
leisure 11.6
tourism 11.5
sky 11.5
old 11.1
patriot 11
adult 11
industrial 10.9
destruction 10.7
toxic 10.7
history 10.7
protective 10.7
nuclear 10.7
fun 10.5
men 10.3
dirty 9.9
radioactive 9.8
camouflage 9.8
radiation 9.8
accident 9.8
chemical 9.6
gas 9.6
war 9.6
gun 9.6
animal 9.3
stone 9.3
outdoor 9.2
road 9
stalker 8.9
couple 8.7
ancient 8.6
day 8.6
happiness 8.6
sea 8.6
horse 8.5
smoke 8.4
summer 8.4
competition 8.2
happy 8.1
activity 8.1
holiday 7.9
industry 7.7
extreme 7.7
tropical 7.7
traditional 7.5
desert 7.5
ocean 7.5
covering 7.4
child 7.4
tradition 7.4
speed 7.3
sun 7.2
equipment 7.2
recreation 7.2
family 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99
person 94
posing 76.5
sport 66
people 64.8
dancer 61.6
group 58.5
old 52.6
clothes 21.2

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 98.4%
Surprised 6.7%
Fear 5.9%
Sad 2.2%
Confused 0.2%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 11-19
Gender Male, 71.6%
Calm 74.7%
Fear 8.1%
Angry 7.8%
Surprised 7.3%
Confused 4.6%
Sad 3.3%
Happy 1.7%
Disgusted 1.2%

AWS Rekognition

Age 6-14
Gender Male, 99.5%
Surprised 55.4%
Calm 47.3%
Fear 7.7%
Angry 7.5%
Sad 2.9%
Disgusted 2.6%
Confused 1.2%
Happy 0.9%

AWS Rekognition

Age 18-26
Gender Female, 60.3%
Calm 66.9%
Sad 47.7%
Surprised 6.4%
Fear 6.3%
Angry 3.6%
Disgusted 1%
Happy 0.6%
Confused 0.3%

AWS Rekognition

Age 20-28
Gender Female, 98.4%
Surprised 23%
Happy 20.1%
Disgusted 17.8%
Calm 14.9%
Fear 11.8%
Angry 7.7%
Sad 7.7%
Confused 1.4%

AWS Rekognition

Age 13-21
Gender Male, 56.4%
Calm 30.9%
Angry 21.2%
Disgusted 20.7%
Sad 19.2%
Surprised 7.3%
Fear 6.8%
Happy 4.3%
Confused 1.9%

AWS Rekognition

Age 7-17
Gender Female, 97.3%
Sad 100%
Fear 6.6%
Surprised 6.4%
Calm 5.4%
Disgusted 1.5%
Happy 0.8%
Angry 0.7%
Confused 0.3%

AWS Rekognition

Age 2-10
Gender Male, 98.3%
Sad 92.8%
Happy 26%
Calm 15.9%
Fear 7.4%
Surprised 6.6%
Angry 6%
Confused 1.9%
Disgusted 1.2%

AWS Rekognition

Age 2-8
Gender Female, 99.8%
Calm 42.1%
Fear 38%
Happy 15.5%
Surprised 6.6%
Angry 4.5%
Sad 2.6%
Disgusted 1%
Confused 0.7%

Microsoft Cognitive Services

Age 11
Gender Male
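
The face analysis entries above (age range, gender, and per-emotion confidences) match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal Python sketch follows; the local file name is a hypothetical placeholder, and the printed fields simply mirror the entries listed in this section.

    # Sketch: face analysis with AWS Rekognition via boto3.
    # Requesting Attributes=["ALL"] returns age range, gender, and emotions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("shahn_horse_dance_java.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')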

Feature analysis

Amazon

Boy 99.1%
Male 99.1%
Person 99.1%
Teen 99.1%
Child 98.3%
Female 98.3%
Girl 98.3%
Adult 92.9%
Bride 92.9%
Woman 92.9%
Wheel 79.6%