Human Generated Data

Title

Chinese Women: Experimental Color

Date

c. 1941

People

Artist: Edward Steichen, American, 1879-1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.80.9

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.6
Human 97.6
Musician 95.1
Musical Instrument 95.1
Person 95.1
Person 94.4
Leisure Activities 92.9
Advertisement 92.6
Poster 92.6
Clothing 87.5
Apparel 87.5
Hair 70.1
Guitar 66.7
Helmet 65.8
Performer 62.1
Guitarist 62.1

Imagga
created on 2022-02-26

person 29.1
black 25.4
outfit 25.2
sax 23.2
wind instrument 18.5
people 18.4
brass 18.4
adult 18.2
sexy 17.7
dark 17.5
guitar 17.4
musical instrument 17.3
fashion 15.8
man 14.8
musician 14.6
art 14.3
performer 13.8
religion 13.4
silhouette 13.2
music 13.2
male 12.8
warrior 12.7
style 12.6
cornet 12.5
model 12.4
portrait 12.3
lady 12.2
attractive 11.9
weapon 11.8
war 11.5
night 11.5
electric guitar 11.4
body 11.2
bass 10.8
face 10.6
clothing 10.6
studio 10.6
device 10.6
singer 10.3
dress 9.9
horn 9.9
soldier 9.8
horror 9.7
speaker 9.5
culture 9.4
vintage 9.1
human 9
stringed instrument 8.9
evil 8.8
hair 8.7
military 8.7
artist 8.7
ancient 8.6
play 8.6
elegant 8.6
instrument 8.4
entertainment 8.3
statue 8
light 8
women 7.9
gun 7.8
scary 7.7
costume 7.6
clothes 7.5
fun 7.5
gold 7.4
slim 7.4
decoration 7.3
sensual 7.3
sculpture 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.7
clothing 88.2
person 82.7

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 100%
Happy 43.4%
Sad 23%
Calm 13.9%
Confused 5.9%
Disgusted 4.7%
Fear 4.6%
Angry 2.4%
Surprised 2.2%

AWS Rekognition

Age 21-29
Gender Female, 100%
Calm 73.8%
Sad 10.8%
Happy 6.6%
Angry 2.4%
Surprised 2.2%
Disgusted 1.8%
Fear 1.5%
Confused 0.9%

AWS Rekognition

Age 26-36
Gender Female, 99.3%
Sad 51.2%
Calm 32.8%
Confused 7%
Fear 4.1%
Disgusted 1.6%
Surprised 1.2%
Happy 1.1%
Angry 1%

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%
Poster 92.6%
Helmet 65.8%

Text analysis

Amazon

B M Y
an

Google

BM
BM