Human Generated Data

Title

Untitled

Date

1998

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.257

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Toy 99.5
Doll 90.5
Figurine 82.5
Hat 77.3
Apparel 77.3
Clothing 77.3
Person 72
Human 72
Barbie 63
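Each tag above pairs a label with a confidence score (0-100). A minimal sketch of filtering such a tag list by a confidence threshold, using the Amazon scores transcribed from the list above (the threshold value is an illustrative choice, not part of the record):

```python
# Label/confidence pairs transcribed from the Amazon tag list above.
amazon_tags = [
    ("Toy", 99.5), ("Doll", 90.5), ("Figurine", 82.5), ("Hat", 77.3),
    ("Apparel", 77.3), ("Clothing", 77.3), ("Person", 72.0),
    ("Human", 72.0), ("Barbie", 63.0),
]

def confident_labels(tags, threshold=80.0):
    """Keep only labels whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident_labels(amazon_tags))  # → ['Toy', 'Doll', 'Figurine']
```

The same filtering applies unchanged to the Clarifai, Imagga, Google, and Microsoft lists below, since all report the same label-plus-score shape.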

Clarifai
created on 2018-11-05

woman 99.5
fashion 98.8
girl 96.7
wear 96.2
portrait 95.6
dress 94.8
model 93.6
pretty 92.8
glamour 92.5
one 90.7
beautiful 90.6
adult 89.9
sexy 88.9
people 88.9
studio 87.5
contemporary 85.5
elegant 85.1
retro 84
young 83.8
lid 81.4

Imagga
created on 2018-11-05

costume 70
adult 38.2
fashion 33.2
attractive 32.9
dress 32.6
person 32.1
pretty 30.8
portrait 29.2
smile 28.5
blue 26.3
happy 24.5
outfit 24.1
sexy 24.1
people 24
smiling 23.2
brunette 22.7
model 22.6
lady 21.1
cute 20.8
studio 20.5
hair 19.8
cheerful 18.7
posing 18.7
clothing 18.5
women 17.4
happiness 17.3
pose 17.2
lifestyle 16.6
youth 16.2
face 15.6
elegance 15.1
look 14.9
style 14.9
hat 14.8
expression 14.5
standing 13.9
sensuality 13.6
professional 13.6
elegant 12.9
business 12.8
one 12.7
worker 12.5
job 12.4
casual 11.9
sensual 11.8
dressed 11.7
holding 11.6
20s 11
work 11
lovely 10.7
color 10.6
bag 10.4
joy 10
male 9.9
looking 9.6
body 9.6
child 9.4
slim 9.2
long 9.2
friendly 9.2
modern 9.1
human 9
fun 9
school 9
seductive 8.6
party 8.6
uniform 8.6
charming 8.5
clothes 8.4
feminine 8.4
shopping 8.4
occupation 8.3
teenager 8.2
businesswoman 8.2
gorgeous 8.2
black 8
education 7.8
joyful 7.4
performer 7.3
student 7.3
holiday 7.2

Google
created on 2018-11-05

standing 74.6
art 71.4
painting 70
girl 65.6
doll 60.1
modern art 58.5
vintage clothing 56.1
human behavior 56

Microsoft
created on 2018-11-05

wall 99.1
monitor 98
indoor 92.6
electronics 88.5
screen 80
display 61.1
picture frame 27.1
screenshot 16.3

Face analysis
AWS Rekognition

Age 4-9
Gender Female, 99.5%
Sad 1.5%
Angry 0.5%
Calm 24.5%
Surprised 68.1%
Happy 0.9%
Disgusted 0.3%
Confused 4.2%
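The emotion scores above sum across a probability-like distribution, and the reported face reading is usually taken as the highest-scoring entry. A minimal sketch, using the Rekognition values transcribed from the list above:

```python
# Emotion/confidence pairs transcribed from the AWS Rekognition results above.
emotions = {
    "Sad": 1.5, "Angry": 0.5, "Calm": 24.5, "Surprised": 68.1,
    "Happy": 0.9, "Disgusted": 0.3, "Confused": 4.2,
}

# The dominant emotion is the key with the maximum confidence.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Surprised
```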

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Doll 90.5%
Hat 77.3%
Person 72%

Captions

Microsoft

a person standing in front of a television screen 84.6%
a person standing in front of a television 84.5%
a person standing in front of a screen 84.4%

Text analysis

Amazon

utee
utee 19A0 AP
AP
19A0