Human Generated Data

Title

Chinese Women: Experimental Color

Date

c. 1941

People

Artist: Edward Steichen, American, 1879–1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.80.11

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Poster 98.6
Advertisement 98.6
Clothing 97.5
Apparel 97.5
Person 91.9
Human 91.9
Person 89.5
Person 88.5
Helmet 86.3
Crowd 80.1
Figurine 74.4
Costume 69.6
Leisure Activities 67.9
Robe 65.3
Fashion 65.3
Gown 63.2
Festival 59.8
Toy 59.6
Performer 55.4

Imagga
created on 2022-02-26

kimono 26.4
religion 21.5
robe 21
clothing 20.5
art 20.3
person 19.3
costume 17.8
performer 17.7
garment 16.2
colorful 15.8
culture 15.4
statue 15.3
religious 15
man 14.8
temple 14.5
people 13.9
color 12.8
traditional 12.5
entertainer 12.3
ancient 12.1
church 12
sculpture 11.8
male 11.3
travel 11.3
portrait 11
prayer 10.6
god 10.5
faith 10.5
old 10.4
icon 10.3
architecture 10.1
adult 10
face 9.9
covering 9.7
decoration 9.7
style 9.6
spiritual 9.6
hat 9.3
fashion 9
gold 9
dress 9
black 9
stage 9
holy 8.7
golden 8.6
monument 8.4
east 8.4
outfit 8.4
musical instrument 8.4
studio 8.3
dancer 8.2
detail 8
oriental 8
celebration 8
sacred 7.8
antique 7.8
play 7.7
worship 7.7
saint 7.7
war 7.7
spirituality 7.7
cathedral 7.7
ethnic 7.6
fun 7.5
city 7.5
symbol 7.4
makeup 7.3
paint 7.2
musician 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.7
clothing 90.2
cartoon 90
indoor 85.3
person 81
dance 80.3
painting 76.2
woman 64.1

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 96.4%
Calm 0.9%
Surprised 0.8%
Angry 0.5%
Confused 0.4%
Fear 0.3%
Sad 0.3%
Disgusted 0.3%

AWS Rekognition

Age 20-28
Gender Female, 99.8%
Calm 79.4%
Happy 12.8%
Sad 4.1%
Surprised 1%
Fear 0.8%
Angry 0.7%
Disgusted 0.7%
Confused 0.4%

AWS Rekognition

Age 28-38
Gender Female, 53.3%
Calm 67%
Sad 22.9%
Confused 2.9%
Happy 1.8%
Fear 1.7%
Surprised 1.5%
Angry 1.4%
Disgusted 0.8%

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Poster 98.6%
Person 91.9%
Helmet 86.3%

Captions

Microsoft

a person standing next to a graffiti covered wall 50.7%
a person standing in front of a graffiti covered wall 50.6%
a person standing in front of a graffiti wall 50.5%

Text analysis

Amazon

MY
B MY
B
ي
NO

Google

BMY
BMY 3%
3%