Human Generated Data

Title

Trampoline

Date

1960

People

Artist: Harold Edgerton, American, 1903–1990

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of The Harold and Esther Edgerton Family Foundation, P1996.49

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Acrobatic 98.9
Human 98.9
Gymnastics 87.4
Sport 87.4
Sports 87.4
Leisure Activities 78.9
Person 78.5
Bird 66.6
Animal 66.6
Athlete 63.7
Gymnast 58.2
Person 57.6
Balance Beam 55.8

Clarifai
created on 2023-10-15

ballet 99.1
people 98.4
ballerina 97.8
dancer 97.6
one 97.5
balance 96.9
action 96.3
man 95.8
skill 94.5
no person 94.2
adult 93.7
agility 93.5
motion 91.6
art 91.3
woman 90.8
music 89.6
gymnastics 88.1
two 87.6
jumping 87.2
action energy 87

Imagga
created on 2021-12-14

sculpture 30.7
statue 29.4
fountain 28.6
art 25.9
sky 19.1
structure 17.4
body 16.8
fin 14.9
man 14.8
ancient 14.7
stone 14.5
religion 14.3
tree 13.9
dance 13.9
jump 13.4
action 13
marble 13
water 12.7
god 12.4
bird 12.3
fly 12.1
male 12
stabilizer 12
freedom 11.9
jumping 11.6
sport 11.5
black 10.8
knot 10.4
culture 10.2
exercise 10
rope 10
active 9.9
travel 9.8
roman 9.7
human 9.7
adult 9.7
muscle 9.6
flying 9.5
architecture 9.4
device 9.3
air 9.2
creation 9.2
wing 9.2
fastener 9.1
fitness 9
outdoors 8.9
naked 8.7
feather 8.6
muscular 8.6
outdoor 8.4
old 8.4
symbol 8.1
wildlife 8
line 7.9
athlete 7.9
holy 7.7
wings 7.7
woody plant 7.6
arm 7.6
person 7.5
religious 7.5
monument 7.5
sketch 7.4
park 7.4
back 7.3

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.9
drawing 83.7
window 80.8
cartoon 80.7
sketch 76.8
black 72.5
dance 68.1

Face analysis

AWS Rekognition

Age 13-23
Gender Female, 67%
Fear 77.1%
Angry 8.2%
Sad 6.6%
Calm 3.3%
Surprised 1.9%
Confused 1.3%
Happy 1.1%
Disgusted 0.4%

AWS Rekognition

Age 22-34
Gender Male, 94.5%
Calm 91.6%
Sad 7.2%
Angry 0.3%
Confused 0.2%
Surprised 0.2%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 78.5%
Bird 66.6%

Captions

Microsoft
created on 2021-12-14

an old photo of a woman 66.7%
a woman posing for a photo 66.1%
old photo of a woman 64.3%