Human Generated Data

Title

Marlene Dietrich, New York

Date

1932, printed 1985

People

Artist: Edward Steichen, American, 1879–1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sidney and Shirley Singer, 2013.180.7

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags (label followed by confidence score, 0–100)

Amazon
created on 2019-04-08

Plant 99.3
Human 98.7
Blossom 98.2
Flower 98.2
Flower Arrangement 97.4
Person 96.9
Flower Bouquet 96
Apparel 88.6
Clothing 88.6
Ikebana 85.4
Pottery 85.4
Vase 85.4
Art 85.4
Ornament 85.4
Jar 85.4
Leisure Activities 71.1
Photography 60.6
Photo 60.6
Dance Pose 59.2
Face 59
Portrait 59
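
The label/confidence pairs above are the kind of output produced by AWS Rekognition label detection. A minimal sketch of how comparable tags could be generated, assuming boto3 with configured AWS credentials and an illustrative local filename for the image:

```python
# Sketch: fetch label/confidence pairs like those listed above via AWS Rekognition.
# Assumes AWS credentials are configured; the filename is illustrative only.
import boto3

rekognition = boto3.client("rekognition")

with open("steichen_dietrich_1932.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # roughly the length of the list above
    MinConfidence=55.0,  # drop very low-confidence labels
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```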

Clarifai
created on 2018-02-09

people 99.9
adult 99.4
two 99.2
one 98.9
woman 98.5
wear 98.3
furniture 96.5
facial expression 96.3
portrait 96
man 94.8
actress 94.7
wedding 93.6
music 93.5
dress 92.8
veil 91.1
bride 90.5
girl 89.3
flower arrangement 89
outfit 85.3
dancer 82.1

Imagga
created on 2018-02-09

harp 75.7
adult 28.9
person 26.8
attractive 26.6
sexy 25.7
portrait 25.2
fashion 24.1
support 23.6
device 23
pretty 22.4
hair 20.6
lifestyle 20.2
people 20.1
interior 19.5
black 18.9
model 18.7
indoors 18.5
dress 18.1
face 17.8
brunette 17.4
sitting 17.2
one 17.2
lady 17
human 16.5
home 16
body 15.2
posing 15.1
cute 15.1
room 15.1
women 15
style 14.8
sensual 14.6
sensuality 14.5
man 12.1
looking 12
skin 11.9
holding 11.6
passion 11.3
elegant 11.1
elegance 10.9
house 10.9
male 10.8
smile 10.7
cheerful 10.6
happiness 10.2
casual 10.2
emotion 10.1
alone 10
gorgeous 10
hand 9.9
lovely 9.8
couple 9.6
love 9.5
legs 9.4
happy 9.4
relaxation 9.2
nice 9.2
working 8.8
smiling 8.7
sad 8.7
expression 8.5
clothing 8.4
joy 8.4
dark 8.4
stylish 8.1
bed 8
look 7.9
chair 7.9
luxury 7.7
wall 7.7
bass 7.6
erotic 7.6
domestic 7.5
blond 7.5
leisure 7.5
lips 7.4
window 7.3
makeup 7.3
indoor 7.3

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

person 97.5

Color Analysis

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 95.5%
Surprised 4.5%
Angry 5.6%
Disgusted 11.4%
Confused 20.4%
Happy 15.3%
Calm 37.1%
Sad 5.7%
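
The age range, gender, and emotion confidences above correspond to face detection with full attributes in AWS Rekognition. A minimal sketch under the same assumptions (boto3 with configured credentials, illustrative filename):

```python
# Sketch: retrieve age range, gender, and emotion confidences for detected faces.
# Assumes AWS credentials are configured; the filename is illustrative only.
import boto3

rekognition = boto3.client("rekognition")

with open("steichen_dietrich_1932.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```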

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
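
The likelihood ratings above match the enum-style output of Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library with application credentials and an illustrative filename:

```python
# Sketch: fetch face-attribute likelihood ratings like those listed above.
# Assumes Google Cloud application credentials; the filename is illustrative only.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steichen_dietrich_1932.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Map the Likelihood enum values (0-5) to the labels shown above.
likelihoods = ("Unknown", "Very unlikely", "Unlikely",
               "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihoods[face.surprise_likelihood])
    print("Anger", likelihoods[face.anger_likelihood])
    print("Sorrow", likelihoods[face.sorrow_likelihood])
    print("Joy", likelihoods[face.joy_likelihood])
    print("Headwear", likelihoods[face.headwear_likelihood])
    print("Blurred", likelihoods[face.blurred_likelihood])
```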

Feature analysis

Amazon

Person 96.9%

Captions

Text analysis

Amazon

ao: