Human Generated Data

Title

Maya

Date

1942-1947, printed 1987

People

Artist: Alexander Hammid, American, 1907-2004

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.249

Copyright

© Alexander Hammid

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Person 92.1
Human 92.1
Furniture 86.3
Finger 84.6
Clothing 81.3
Apparel 81.3
Newborn 79.5
Baby 79.5
Painting 73.4
Art 73.4
Couch 55.9
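Each machine-generated tag above pairs a label with a confidence score. A minimal sketch of how such output might be filtered to the higher-confidence labels, using the Amazon values listed above (the threshold is an assumption for illustration):

```python
# (label, confidence) pairs copied from the Amazon tag list above.
amazon_tags = [
    ("Person", 92.1), ("Human", 92.1), ("Furniture", 86.3),
    ("Finger", 84.6), ("Clothing", 81.3), ("Apparel", 81.3),
    ("Newborn", 79.5), ("Baby", 79.5), ("Painting", 73.4),
    ("Art", 73.4), ("Couch", 55.9),
]

def confident_tags(tags, threshold=80.0):
    """Return labels meeting the confidence threshold, highest score first.

    The 80.0 cutoff is a hypothetical choice, not one used by the museum.
    """
    return [label for label, score in sorted(tags, key=lambda t: -t[1])
            if score >= threshold]

print(confident_tags(amazon_tags))
# ['Person', 'Human', 'Furniture', 'Finger', 'Clothing', 'Apparel']
```

At an 80% cutoff, the lower-confidence guesses such as "Couch" (55.9) drop out while the strongest labels survive.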

Clarifai
created on 2018-02-10

people 99.9
one 99.2
portrait 99.1
adult 98.8
man 96.1
actress 94.8
woman 93.6
monochrome 93.4
two 93.2
furniture 90.9
wear 89.3
facial expression 88.5
music 88.2
reclining 86.7
sit 84.8
seat 84.2
musician 82.6
singer 79.2
nude 78.1
writer 77.4

Imagga
created on 2018-02-10

person 37
adult 31.1
armchair 29.3
attractive 27.3
sexy 25.7
people 25.1
portrait 24.6
happy 24.5
adolescent 21.9
model 21.8
hair 21.4
man 20.8
body 20.8
mother 20.7
sitting 19.8
skin 19.7
pretty 19.6
juvenile 18.9
home 18.4
love 18.2
black 18.1
male 17.7
child 17.7
happiness 17.2
parent 17.2
lady 17.1
sofa 16.7
face 16.3
looking 16
fashion 15.8
lifestyle 14.5
smile 14.3
posing 14.2
cute 13.6
human 13.5
one 13.4
studio 12.9
youth 12.8
sensual 12.7
women 12.7
couple 12.2
erotic 11.8
couch 11.6
smiling 10.9
family 10.7
lovely 10.7
desire 10.6
look 10.5
brunette 10.5
sensuality 10
cheerful 9.8
indoors 9.7
room 9.6
bed 9.6
father 9.6
wife 9.5
relationship 9.4
casual 9.3
lingerie 9.3
relax 9.3
elegance 9.2
gorgeous 9.1
pose 9.1
dress 9
interior 8.9
husband 8.8
together 8.8
belly 8.7
resting 8.6
expression 8.5
clothing 8.5
legs 8.5
scholar 8.4
brother 8.4
joy 8.4
indoor 8.2
dad 8
boy 7.8
eyes 7.8
men 7.7
elegant 7.7
health 7.6
guy 7.6
dark 7.5
holding 7.4
lips 7.4
laptop 7.4
makeup 7.3
relaxing 7.3
romance 7.1
handsome 7.1

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 99.6
indoor 95.8

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 26-43
Gender Female, 99.7%
Confused 6.1%
Calm 60.9%
Happy 6.1%
Disgusted 13.8%
Angry 3.6%
Sad 6.0%
Surprised 3.5%
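The emotion scores above form a per-face distribution. A minimal sketch of picking the dominant emotion from such a result, assuming the percentages exactly as listed in this record:

```python
# Emotion percentages copied from the AWS Rekognition section above.
emotions = {
    "Confused": 6.1, "Calm": 60.9, "Happy": 6.1, "Disgusted": 13.8,
    "Angry": 3.6, "Sad": 6.0, "Surprised": 3.5,
}

# The highest-scoring emotion is taken as the dominant one.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```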

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 92.1%
Painting 73.4%

Captions

Microsoft

a person sitting on a bed 64.9%
a girl sitting on a bed 43.3%
a person sitting on a table 43.2%