Human Generated Data

Title

Untitled (Horse Dance, Java)

Date

January 26, 1960–February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2326

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

People 99.9
Person 99
Person 99
Person 98.9
Adult 98.9
Female 98.9
Woman 98.9
Person 98.5
Clothing 97.9
Shorts 97.9
Person 96.5
Boy 96.5
Child 96.5
Male 96.5
Person 95.2
Person 95
Person 87.9
Person 86.6
Person 84.1
Person 82.5
Face 78.4
Head 78.4
Machine 75.4
Wheel 75.4
Outdoors 74.7
Back 74.6
Body Part 74.6
Animal 74
Elephant 74
Mammal 74
Wildlife 74
Dancing 73.4
Leisure Activities 73.4
Person 68
Nature 61.6
Bicycle 60.8
Transportation 60.8
Vehicle 60.8
Stilts 55.1

Clarifai
created on 2018-05-10

people 100
group together 99.7
group 99.2
many 99.1
adult 97.8
man 95.9
wear 95.9
several 94.1
dancing 92.6
child 91.3
military 89.7
spectator 88.4
woman 88
outfit 87.2
recreation 86.2
five 84.8
dancer 81.1
music 80.9
veil 80.8
athlete 80.5

Imagga
created on 2023-10-05

innocent 43.6
child 37.6
beach 33
man 24.9
person 24.8
walking 23.7
people 23.4
vacation 22.9
family 22.2
male 21.4
together 20.1
couple 20
summer 19.9
boy 19.1
sand 18.6
travel 16.9
lifestyle 16.6
ocean 16.6
love 16.6
holiday 16.5
fun 16.5
walk 16.2
water 16
mother 15.7
outdoors 15.2
parent 15.1
pedestrian 15.1
happy 15
father 15
sea 14.9
outdoor 14.5
two 14.4
coast 13.5
dad 13.4
sky 13.4
leisure 13.3
juvenile 13.1
tropical 12.8
active 12.7
adult 12.3
old 11.8
son 11.7
kid 11.5
kin 11.4
sport 11.3
clothing 10.7
shore 10.2
girls 10
life 9.6
hands 9.6
daughter 9.5
women 9.5
sunny 9.5
day 9.4
happiness 9.4
world 9.2
joy 9.2
children 9.1
holding 9.1
park 9.1
sarong 9
recreation 9
smiling 8.7
friends 8.5
relax 8.4
playing 8.2
exercise 8.2
group 8.1
smile 7.8
play 7.8
run 7.7
kids 7.5
coastline 7.5
senior 7.5
protection 7.3
danger 7.3
sunset 7.2
skirt 7.1
portrait 7.1
animal 7.1
autumn 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.3
outdoor 99.1
sport 81.1
standing 80.9
group 73.2
dancer 64.6
posing 59.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 100%
Calm 98.8%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.3%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 35-43
Gender Male, 99.6%
Angry 34.6%
Calm 29.8%
Surprised 14.8%
Happy 13.8%
Fear 6.7%
Sad 3.7%
Disgusted 2.3%
Confused 2.2%

AWS Rekognition

Age 4-12
Gender Female, 96.9%
Fear 97.7%
Surprised 6.3%
Sad 2.2%
Calm 1.9%
Angry 0.4%
Happy 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 18-24
Gender Female, 99.7%
Calm 51%
Disgusted 18.3%
Fear 12.9%
Angry 7.8%
Surprised 7.2%
Sad 4.8%
Confused 1.8%
Happy 0.7%

AWS Rekognition

Age 29-39
Gender Female, 71.5%
Fear 91%
Calm 10%
Surprised 6.6%
Sad 6.3%
Disgusted 2.1%
Happy 1.4%
Confused 1.4%
Angry 0.8%

AWS Rekognition

Age 16-22
Gender Female, 62.4%
Calm 90.3%
Fear 7.7%
Surprised 6.7%
Angry 2.4%
Sad 2.3%
Confused 1.1%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 24-34
Gender Male, 71.8%
Sad 99.8%
Calm 26.3%
Surprised 6.3%
Fear 6%
Happy 1.8%
Confused 0.6%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 25-35
Gender Male, 89.9%
Sad 99.4%
Confused 26.3%
Surprised 6.7%
Fear 6.2%
Calm 5.3%
Happy 1.9%
Disgusted 1.1%
Angry 1.1%

AWS Rekognition

Age 21-29
Gender Male, 54.5%
Sad 99.8%
Calm 10.6%
Fear 7.6%
Surprised 7%
Disgusted 6.7%
Angry 4.4%
Happy 1.7%
Confused 1%

Microsoft Cognitive Services

Age 21
Gender Male

Feature analysis

Amazon

Person 99%
Adult 98.9%
Female 98.9%
Woman 98.9%
Boy 96.5%
Child 96.5%
Male 96.5%
Wheel 75.4%
Elephant 74%
Bicycle 60.8%