Human Generated Data

Title

Untitled

Date

1997

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.255

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Doll 96.8
Toy 96.8
Clothing 96.6
Hat 96.6
Apparel 96.6
Person 87.1
Human 87.1
Barbie 59.8
Figurine 59.8
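
Each line above pairs a label with a confidence score (as a percentage). A minimal Python sketch of how such scored labels might be filtered by a confidence threshold; the values are copied from the Amazon list above, and the 80% cutoff is an arbitrary illustration, not part of the source data:

```python
# Label/confidence pairs copied from the Amazon (Rekognition) list above.
labels = [
    ("Doll", 96.8), ("Toy", 96.8), ("Clothing", 96.6), ("Hat", 96.6),
    ("Apparel", 96.6), ("Person", 87.1), ("Human", 87.1),
    ("Barbie", 59.8), ("Figurine", 59.8),
]

def confident_labels(labels, threshold=80.0):
    """Keep only labels at or above the confidence threshold (in %)."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(labels))
# -> ['Doll', 'Toy', 'Clothing', 'Hat', 'Apparel', 'Person', 'Human']
```

Raising the threshold to 95 would drop "Person" and "Human" as well, leaving only the five highest-confidence labels.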

Clarifai
created on 2018-11-05

portrait 99.6
girl 99.6
model 99
fashion 98.9
woman 98.7
beautiful 98.2
adult 97.7
people 97.4
one 96.9
lid 95.8
dress 95.2
hairdo 94.6
studio 93.9
face 93.7
retro 93.6
wear 92.8
lady 92.8
fashionable 91.1
style 89.6
glamour 89.5

Imagga
created on 2018-11-05

doll 73.2
plaything 58
model 46.8
attractive 46.3
portrait 44.1
fashion 43.1
pretty 42.1
cover girl 40.4
sexy 40.3
hair 39.7
face 36.3
person 35.9
adult 35
posing 33.9
lady 31.7
brunette 31.4
smasher 30.7
people 30.2
cute 29.5
human 24.8
makeup 23.9
sensuality 23.7
happy 23.2
black 23.2
make 22.7
dress 22.6
skin 22.1
sensual 21.9
eyes 21.6
expression 21.4
lips 21.3
elegance 21
smile 20.7
one 20.2
body 19.2
youth 18.8
looking 18.4
hairstyle 18.2
style 17.8
women 17.4
lovely 16.9
look 16.7
smiling 16.7
gorgeous 16.4
elegant 16.3
casual 16.1
studio 16
pose 14.5
teenager 13.7
stylish 13.6
modern 13.4
blond 13.3
clothing 13.3
clothes 12.2
natural 12.1
nice 11.9
lifestyle 11.6
hand 11.4
fashionable 11.4
head 10.9
hot 10.9
closeup 10.8
eye 10.7
cosmetic 10.6
close 10.3
teen 10.1
wellness 9.2
hat 8.9
erotic 8.7
jewelry 8.7
vogue 8.7
healthy 8.2
cheerful 8.1
brown 8.1
happiness 7.9
flirt 7.8
tender 7.7
seductive 7.7
shoulder 7.6
joy 7.5
holding 7.4
slim 7.4
long 7.4
confident 7.3
bright 7.2

Google
created on 2018-11-05

modern art 76.8
art 64.9
girl 63.6
picture frame 61.3
painting 52.9

Microsoft
created on 2018-11-05

wall 95.8
indoor 92.6
person 91.4
electronics 83.5
display 56.4
picture frame 19.2

Face analysis

AWS Rekognition

Age 10-15
Gender Female, 99.9%
Calm 15.7%
Disgusted 1.7%
Confused 5.4%
Sad 9.7%
Angry 5.7%
Happy 0.7%
Surprised 61%
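
The emotion scores above are reported per category rather than as a single verdict. A short sketch of selecting the dominant emotion from such output; the values are copied from the AWS Rekognition section above:

```python
# Per-emotion confidences (%) copied from the AWS Rekognition output above.
emotions = {
    "Calm": 15.7, "Disgusted": 1.7, "Confused": 5.4, "Sad": 9.7,
    "Angry": 5.7, "Happy": 0.7, "Surprised": 61.0,
}

# The highest-confidence category is taken as the dominant emotion.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])
# -> Surprised 61.0
```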

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Hat 96.6%
Person 87.1%

Captions

Microsoft

a person standing in front of a statue 47.1%
a person posing for the camera 47%
a person standing in front of a screen 46.9%

Text analysis

Amazon

1992
D-e
D-e wpo 1992 Y5
Y5
wpo