Human Generated Data

Title

Untitled (wake)

Date

1950

People

Artist: W. Eugene Smith, American, 1918–1978

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.932

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.6
Human 98.6
Person 98.4
Person 97.9
Person 94.2
Person 94.1
Person 90.8
Art 80.9
Painting 80.9
People 69.1
Head 58.5
Batman 58.1
Portrait 57.3
Photography 57.3
Face 57.3
Photo 57.3

Imagga
created on 2022-01-09

black 55.8
person 33
people 27.3
male 26.2
dark 25.9
man 25.5
portrait 25.2
sexy 24.9
adult 24
model 22.6
attractive 21
fashion 19.6
face 19.2
night 17.8
hair 17.4
mask 17.3
human 16.5
style 14.8
clothing 14.8
pretty 14.7
sensual 14.5
dress 14.5
one 14.2
evil 13.7
sensuality 13.6
looking 13.6
posing 13.3
covering 13.3
robe 12.9
makeup 12.8
expression 12.8
garment 12.4
passion 12.2
lady 12.2
make 11.8
suit 11.8
studio 11.4
body 11.2
love 11
world 10.8
criminal 10.8
party 10.3
hot 10
silhouette 9.9
costume 9.9
hand 9.9
crime 9.7
horror 9.7
scary 9.7
brunette 9.6
seductive 9.6
eyes 9.5
erotic 9.5
lifestyle 9.4
youth 9.4
danger 9.1
sorcerer 8.9
couple 8.7
women 8.7
sitting 8.6
clothes 8.4
pose 8.2
stylish 8.1
darkness 7.8
men 7.7
modern 7.7
sexual 7.7
desire 7.7
mystery 7.7
dancer 7.6
disguise 7.6
dangerous 7.6
dance 7.6
elegance 7.6
buddy 7.4

Google
created on 2022-01-09

Flash photography 88.5
Hat 84.8
Style 83.8
Sleeve 83.6
Black-and-white 83.5
Tints and shades 76.9
T-shirt 75.7
Monochrome photography 74
Monochrome 73.4
Event 73.3
Darkness 72.2
Fun 71.9
Baseball cap 70.9
Room 68.3
Crew 68.2
Vintage clothing 67.1
Font 65.6
Comfort 62.7
Sitting 61
Team 57.3

Microsoft
created on 2022-01-09

person 98.2
human face 97.4
clothing 97
black and white 89.9
smile 82.8
man 77.7
woman 70.9
text 68.6

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 95.6%
Sad 71%
Calm 28.5%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0%
Confused 0%

AWS Rekognition

Age 23-31
Gender Female, 99.7%
Calm 98.9%
Sad 0.9%
Surprised 0.1%
Angry 0%
Confused 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 95%
Calm 82.1%
Sad 7.6%
Angry 6.2%
Surprised 1.2%
Disgusted 1.1%
Fear 0.8%
Confused 0.7%
Happy 0.3%

AWS Rekognition

Age 34-42
Gender Male, 87.8%
Sad 52.5%
Fear 29.3%
Calm 10.2%
Surprised 3.2%
Angry 2.7%
Confused 0.9%
Disgusted 0.7%
Happy 0.4%

AWS Rekognition

Age 52-60
Gender Male, 98.2%
Sad 84.2%
Calm 15.7%
Confused 0%
Disgusted 0%
Happy 0%
Angry 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Calm 96.3%
Sad 1.4%
Happy 0.7%
Angry 0.4%
Fear 0.4%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Painting 80.9%

Captions

Microsoft

a group of people sitting in a dark room 87.8%
a group of people in a dark room 87.7%
a group of people sitting around a baseball field 51%