Human Generated Data

Title

Chinese Women: Experimental Color

Date

c. 1941

People

Artist: Edward Steichen (American, 1879–1973)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.80.4

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 99
Clothing 99
Helmet 96.2
Person 95
Human 95
Person 92.1
Person 88
Fashion 76.9
Robe 76.9
Gown 76.4
Poster 74
Advertisement 74
Leisure Activities 73.9
Crowd 68.3
Figurine 62.3
Stage 58.5
Kimono 58.4
Festival 57
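
The scores attached to these tags are detection confidences on a 0–100 scale. A minimal sketch of how such labels are typically requested from AWS Rekognition, assuming boto3 with configured credentials and a local copy of the image (the filename below is hypothetical):

```python
# Minimal sketch: label detection with AWS Rekognition via boto3.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steichen_chinese_women.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out in the high 50s
)

# Each label carries a name and a confidence in percent, matching
# entries above such as "Apparel 99" and "Helmet 96.2".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```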

Imagga
created on 2022-02-26

automaton 35.1
person 25.2
man 20.8
performer 19.8
art 18.7
comic book 18.2
male 15.6
people 14.5
black 14.4
fun 13.5
costume 13
play 12.9
culture 12.8
adult 12.7
religion 12.5
entertainer 12.2
dark 11.7
singer 11.6
clothing 11.2
sexy 11.2
musician 11.1
music 10.9
silhouette 10.7
colorful 10.7
ancient 10.4
color 10
guitar 9.9
fashion 9.8
warrior 9.8
style 9.6
dancer 9.5
face 9.2
entertainment 9.2
traditional 9.1
attractive 9.1
decoration 8.9
night 8.9
soldier 8.8
celebration 8.8
party 8.6
studio 8.3
paint 8.1
outfit 8
happiness 7.8
model 7.8
artist 7.7
war 7.7
player 7.7
power 7.5
happy 7.5
religious 7.5
leisure 7.5
print media 7.3
body 7.2
game 7.1
women 7.1
portrait 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.5
indoor 88.8
clothing 88.2
person 82.1
dance 82
posing 50
picture frame 7.6

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 99.6%
Calm 79.4%
Happy 15.9%
Sad 1.4%
Surprised 1.2%
Angry 0.6%
Disgusted 0.6%
Fear 0.5%
Confused 0.5%

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 98.8%
Calm 0.4%
Surprised 0.2%
Angry 0.2%
Confused 0.2%
Disgusted 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Female, 96.4%
Calm 60.9%
Sad 27.6%
Confused 3.3%
Happy 2.4%
Surprised 2.2%
Disgusted 1.6%
Angry 1.1%
Fear 0.9%
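
Each AWS Rekognition block above reports, per detected face, an estimated age range, a gender estimate with confidence, and emotion confidences that sum to roughly 100%. A minimal sketch of the corresponding call, again assuming boto3 with configured credentials (the filename is hypothetical):

```python
# Minimal sketch: face analysis with AWS Rekognition via boto3.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steichen_chinese_women.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # "ALL" is needed for AgeRange, Gender, and Emotions
)

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```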

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 38
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Likely
Blurred Very unlikely
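
Google Vision reports face attributes as qualitative likelihood bands ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming Google Cloud credentials are configured (the filename is hypothetical):

```python
# Minimal sketch: face detection with the Google Cloud Vision client library.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the filename is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steichen_chinese_women.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Attributes come back as Likelihood enum values (VERY_UNLIKELY ... VERY_LIKELY),
# corresponding to the "Very unlikely" / "Very likely" labels above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```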

Feature analysis

Amazon

Helmet 96.2%
Person 95%
Poster 74%

Captions

Microsoft

a group of people posing for the camera 71.8%
a group of people posing for a photo 64.4%
a group of people posing for a picture 64.3%
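
The three candidate captions with confidences are typical output of Microsoft's image description service. A minimal sketch using the azure-cognitiveservices-vision-computervision package, assuming an endpoint and key are available (the endpoint, key, and filename below are placeholders):

```python
# Minimal sketch: image captioning with Azure Computer Vision.
# The endpoint, key, and filename below are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("steichen_chinese_women.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidence is a 0-1 float; the listing above shows it as a percentage.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence:.1%}")
```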

Text analysis

Amazon

MY
В MY
120
3
В
are
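
The fragments above are Amazon's OCR detections, i.e. whatever character strings the service could read anywhere in the image. A minimal sketch of the corresponding call, assuming boto3 with configured credentials (the filename is hypothetical):

```python
# Minimal sketch: text (OCR) detection with AWS Rekognition via boto3.
# Assumes AWS credentials are configured; the filename is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("steichen_chinese_women.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back as LINE entries plus their constituent WORDs;
# printing the lines reproduces a listing like the one above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```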

Google

B.MYIIII|| OVA
OVA
B.MYIIII||