Human Generated Data

Title

Untitled (man holding dead child in arms)

Date

c. 1934, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5838

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.2
Clothing 96.9
Apparel 96.9
Accessories 95.2
Accessory 95.2
Tie 95.2
Overcoat 93.1
Coat 93.1
Suit 93.1
Person 86
Tuxedo 77.6
Leisure Activities 76.1
Gown 72.3
Fashion 72.3
Robe 69.5
Wedding 68.9
Interior Design 65.9
Indoors 65.9
Finger 64.6
Crowd 64.5
Wedding Gown 62.8
Musical Instrument 60.2
Musician 60.2
Furniture 55.6

Clarifai
created on 2019-11-16

people 99.8
group 97.3
music 94.8
adult 94.4
man 94.4
woman 93.6
audience 93.3
group together 92.1
monochrome 90.4
leader 88.5
movie 87.7
portrait 85.8
actor 85.2
many 82.8
crowd 79.8
chair 78.8
child 73.6
wear 73.4
musician 73.2
war 70.8

Imagga
created on 2019-11-16

black 30.9
person 24.7
people 24.6
grand piano 24.3
piano 23.9
musical instrument 23.7
body 23.2
model 22.6
adult 22.2
stringed instrument 21.9
attractive 21
sexy 20.9
keyboard instrument 19.8
fashion 19.6
silhouette 18.2
percussion instrument 18
lady 17.9
style 17.8
man 16.8
elegance 16
portrait 15.5
studio 15.2
hair 15.1
male 14.2
women 14.2
expression 13.7
sensuality 13.6
human 13.5
pretty 13.3
posing 12.4
one 12
music 11.7
dark 11.7
erotic 11.6
sitting 11.2
elegant 11.1
dance 11
skin 11
face 10.7
slim 10.1
sunset 9.9
nude 9.7
lingerie 9.6
seductive 9.6
hands 9.6
love 9.5
lifestyle 9.4
television 9.3
business 9.1
performer 9
shadow 9
art 8.9
happy 8.8
dancer 8.6
fun 8.2
sensual 8.2
chair 8.2
gorgeous 8.2
dress 8.1
device 8
businessman 7.9
brunette 7.8
party 7.7
dancing 7.7
motion 7.7
wind instrument 7.7
performance 7.7
passion 7.5
clothing 7.4
figure 7.2
looking 7.2

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 97
text 96.5
indoor 92.7
black and white 86.2
person 64.3
clothing 59.6
watching 46.1

Face analysis

AWS Rekognition

Age 37-55
Gender Male, 86.1%
Fear 0.1%
Happy 0.1%
Calm 80.5%
Surprised 0.2%
Disgusted 0.2%
Angry 3.2%
Sad 14.7%
Confused 1%

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Tie 95.2%
Person 86%

Captions

Microsoft

a person standing in front of a television 45.4%