Human Generated Data

Title

Maya

Date

1942-1947, printed 1987

People

Artist: Alexander Hammid, American, 1907-2004

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.244

Copyright

© Alexander Hammid

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Candle 99.7
Person 96.8
Human 96.8
Hair 94.2
Vigil 55.8
Fire 55.1

Clarifai
created on 2018-02-10

people 99.9
portrait 99.6
one 99.4
adult 98.2
woman 95.4
music 93.9
light 88.6
wear 87.2
profile 86.6
musician 84.8
candle 83.4
singer 80.3
man 77.6
facial expression 76.6
monochrome 76.4
art 75
actor 74.5
sit 74
side view 72.8
book series 71.8

Imagga
created on 2018-02-10

candle 66.8
black 29.1
portrait 27.8
adult 27.2
face 27
dark 25.9
makeup 25.2
sexy 24.9
toiletry 23.9
lipstick 23.3
person 23.1
pretty 23.1
eyes 22.4
attractive 21.7
fashion 21.1
lady 21.1
model 21
hair 20.6
lips 20.4
brunette 20.1
people 19.5
cute 17.2
expression 17.1
human 15.8
sensual 15.5
light 15.1
flame 15
skin 14.6
lifestyle 14.5
love 14.2
cosmetic 13.5
man 13.4
body 12.8
make 12.7
look 12.3
fire 12.2
male 12.2
style 11.9
close 11.4
hand 11.4
mouth 11.3
head 10.9
dress 9.9
candles 9.8
burn 9.6
women 9.5
device 9.5
happy 9.4
glow 9.2
studio 9.1
posing 8.9
darkness 8.8
looking 8.8
hands 8.7
nose 8.6
celebration 8
candlelight 7.9
smile 7.8
antique 7.8
vintage 7.7
burning 7.7
elegant 7.7
serious 7.6
erotic 7.6
cosmetics 7.5
one 7.5
closeup 7.4
natural 7.4
20s 7.3
book 7.3
sensuality 7.3
romance 7.1
lovely 7.1

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 99.1
man 94.6
indoor 90.7

Face analysis

AWS Rekognition

Age 35-55
Gender Female, 99.4%
Surprised 2.2%
Confused 1%
Calm 67.5%
Happy 0.1%
Disgusted 1.7%
Angry 3.7%
Sad 23.8%
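The Rekognition emotion scores above are per-emotion confidences that sum to roughly 100%. A minimal sketch of reading such a result to pick the dominant emotion, using the values copied from the record above:

```python
# Emotion confidences from the AWS Rekognition face analysis above.
# The dominant emotion is simply the one with the highest confidence.
emotions = {
    "Calm": 67.5,
    "Sad": 23.8,
    "Angry": 3.7,
    "Surprised": 2.2,
    "Disgusted": 1.7,
    "Confused": 1.0,
    "Happy": 0.1,
}

dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```

Note that the runner-up here ("Sad" at 23.8%) is far from negligible, so downstream uses of such scores often keep the full distribution rather than only the top label.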

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Candle 99.7%
Person 96.8%