Human Generated Data

Title

Agnes Meyer

Date

c. 1915, printed 1984

People

Artist: Edward Steichen, American, 1879–1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sidney and Shirley Singer, 2013.182.4

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 97.2
Face 97.2
Person 96.1
Apparel 91.2
Clothing 91.2
Flower 86.5
Plant 86.5
Blossom 86.5
Finger 81.1
Female 74.5
Photography 66.6
Photo 66.6
Portrait 66.6
Girl 62.8
Cake 57.8
Creme 57.8
Food 57.8
Dessert 57.8
Cream 57.8
Icing 57.8
Hat 57.5

Clarifai
created on 2018-02-09

people 99.9
adult 99.4
one 98.6
portrait 98.6
wear 96.8
woman 96.3
two 95.1
man 94.9
facial expression 92.1
actress 91.5
reclining 85.5
group 83.8
veil 83.6
administration 82.2
monochrome 81.8
furniture 81.6
leader 78.9
actor 78
three 76.1
interaction 75.1

Imagga
created on 2018-02-09

man 30.9
male 29.1
person 28.7
adult 27.8
portrait 26.6
child 25
people 24.6
dad 20.7
father 19.5
parent 19.5
love 19
face 17.1
couple 16.6
happy 16.3
home 16
husband 15.7
attractive 14.7
black 14.4
bed 14.2
lying 14.1
holding 14
baby 13.6
looking 13.6
close 13.1
hand 12.9
human 12.8
pretty 12.6
handsome 12.5
model 12.5
wife 12.3
mother 12.2
sexy 12.1
body 12
one 12
skin 11.6
smiling 10.9
care 10.7
look 10.5
hands 10.4
eyes 10.3
hair 10.3
cute 10.1
smile 10
lady 9.7
hug 9.7
indoors 9.7
bride 9.6
juvenile 9.3
guy 9.3
fashion 9.1
romantic 8.9
lifestyle 8.7
finger 8.7
married 8.6
happiness 8.6
expression 8.5
relationship 8.4
shirt 8.4
hold 8.3
wedding 8.3
make 8.2
romance 8
together 7.9
sleep 7.8
tired 7.8
bedroom 7.7
men 7.7
sitting 7.7
affection 7.7
youth 7.7
health 7.6
marriage 7.6
relaxing 7.3
kid 7.1

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 99.9
woman 97.1
indoor 96.5

Face analysis

AWS Rekognition

Age 23-38
Gender Male, 86.5%
Confused 2.2%
Surprised 0.9%
Angry 10.9%
Happy 0.4%
Calm 82.1%
Disgusted 0.9%
Sad 2.4%

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.1%

Captions

Microsoft

a person sitting down talking on a cell phone 54.1%
a person sitting on a bed talking on the phone 46.4%
a person sitting on a bed 46.3%