Human Generated Data

Title

Maya

Date

1942-1947, printed 1987

People

Artist: Alexander Hammid, American 1907 - 2004

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.254

Copyright

© Alexander Hammid

Machine Generated Data

Tags (confidence %)

Amazon
created on 2019-04-08

Hair 100
Person 90.1
Human 90.1
Face 78.9
Black Hair 73.2
Portrait 62.2
Photography 62.2
Photo 62.2
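
The label-and-confidence pairs above look like the output of an image-labeling API such as AWS Rekognition's DetectLabels; the page does not state exactly how they were produced. A minimal sketch of retrieving comparable tags with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph:

# Minimal sketch (assumption): label/confidence pairs in the style of the
# "Amazon" tag list above, via AWS Rekognition DetectLabels.
# "maya_hammid.jpg" is a hypothetical local file; AWS credentials must be configured.
import boto3

rekognition = boto3.client("rekognition")

with open("maya_hammid.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MaxLabels=10)

for label in response["Labels"]:
    # Confidence is a 0-100 percentage, matching values such as "Person 90.1" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')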

Clarifai
created on 2018-02-10

portrait 99.8
people 99.7
monochrome 98.6
one 98.2
adult 97.7
music 97.3
musician 95.5
singer 93.8
girl 90.9
wear 89
man 88.7
art 87.6
woman 87.5
profile 86.9
model 86.9
retro 84
actress 83.4
fashion 83.1
songwriter 82.9
smoke 82.8

Imagga
created on 2018-02-10

black 55.4
hair 49.2
face 49.1
portrait 48.6
model 43.6
attractive 41.3
wig 38.9
adult 36.9
fashion 36.2
pretty 35.7
sexy 35.4
eyes 35.3
person 33.6
lips 33.4
lady 31.7
skin 31.5
hairpiece 31.2
makeup 28.7
brunette 27.9
sensual 27.3
cute 26.6
make 25.4
sensuality 24.6
attire 23.5
people 22.9
dark 21.7
posing 21.4
lock 21
smile 20
studio 19.8
hairstyle 19.1
expression 18.8
elegance 18.5
look 18.4
body 18.4
happy 18.2
lovely 17.8
clothing 17.4
human 17.3
lifestyle 16.7
women 16.6
long 16.5
head 16
brown 15.5
doll 14.9
youth 14.5
one 14.2
style 14.1
looking 13.6
mouth 13.2
close 13.1
cosmetics 12.2
plaything 11.9
smiling 11.6
ethnic 11.4
natural 11.4
healthy 11.4
cheerful 10.6
seductive 10.5
elegant 10.3
feminine 10.3
glamor 9.6
nose 9.6
closeup 9.4
girls 9.1
child 9
eye 8.9
covering 8.7
luxury 8.6
male 8.5
afro 8.4
hand 8.4
nice 8.3
gorgeous 8.2
glamorous 7.8
females 7.6
charming 7.6
fun 7.5
emotion 7.4
blond 7.3

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 91.3
hairpiece 44

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 97.4%
Confused 3.5%
Surprised 2%
Angry 2.9%
Happy 1%
Calm 30.6%
Disgusted 1.7%
Sad 58.3%
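
The age range, gender, and emotion percentages above match the shape of Rekognition's DetectFaces response. A minimal sketch under the same assumptions as the labeling example (hypothetical file name, configured credentials):

# Minimal sketch (assumption): face attributes in the style of the
# AWS Rekognition block above, via DetectFaces with all attributes requested.
import boto3

rekognition = boto3.client("rekognition")

with open("maya_hammid.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types (CALM, SAD, CONFUSED, ...) each carry their own confidence.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')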

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
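
The "Very unlikely" ratings above correspond to the likelihood enums in Google Cloud Vision's face detection response. A minimal sketch, assuming a recent google-cloud-vision client library, application default credentials, and the same hypothetical image file:

# Minimal sketch (assumption): face likelihoods in the style of the
# Google Vision block above, using the google-cloud-vision library (v2+).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("maya_hammid.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY), not a score.
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)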

Feature analysis

Amazon

Person 90.1%

Captions

Microsoft

a woman looking at the camera 89.7%
a woman in a black shirt 85%
a person posing for the camera 84.9%
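
Captions paired with confidence values like those above are the kind of output Azure's Computer Vision "describe image" operation returns. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package; the endpoint, key, and file name are placeholders:

# Minimal sketch (assumption): image captions with confidences in the style
# of the Microsoft captions above, via Azure Computer Vision's describe operation.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "<your-key>"  # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("maya_hammid.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # Confidence is reported on a 0-1 scale; multiply by 100 for a percentage.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")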