Human Generated Data

Title

Chinese Women: Experimental Color

Date

c. 1941

People

Artist: Edward Steichen, American, 1879–1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.80.10

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags (label, confidence score out of 100)

Amazon
created on 2022-02-26

Human 97
Person 97
Person 96.7
Person 95.6
Musical Instrument 94.6
Musician 94.6
Leisure Activities 93.5
Advertisement 92.9
Poster 92.9
Apparel 73.1
Clothing 73.1
Guitar 71.6
Guitarist 71.6
Performer 71.6
Helmet 70
Crowd 63.5
Person 62.3
Figurine 56.9
Festival 56.7
Dance Pose 56.4
Stage 55.7
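
The Amazon list above has the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such tags can be generated via boto3, assuming locally configured AWS credentials and a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder filename for the image being tagged.
with open("image.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest confidence shown above is 55.7
    )

# Print "label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```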

Imagga
created on 2022-02-26

person 31
outfit 26
performer 23.9
black 22.3
musician 20.6
people 20.1
art 19.7
sexy 19.3
fashion 18.8
singer 17.6
costume 17
man 16.8
guitar 15.3
religion 15.2
entertainer 14.2
male 14.2
model 14
adult 13.8
lady 13.8
dark 13.4
style 13.3
attractive 13.3
warrior 12.7
portrait 12.3
face 12.1
culture 12
silhouette 11.6
studio 11.4
weapon 11.1
music 11.1
soldier 10.7
clothing 10.3
statue 10.1
instrument 9.7
artist 9.6
war 9.6
uniform 9.6
sax 9.6
ancient 9.5
hair 9.5
clothes 9.4
dancer 9.3
slim 9.2
entertainment 9.2
sensual 9.1
color 8.9
night 8.9
decoration 8.9
conceptual 8.8
body 8.8
military 8.7
design 8.6
hairstyle 8.6
elegant 8.6
comic book 8.5
sculpture 8.5
power 8.4
traditional 8.3
brass 8.2
fun 8.2
gold 8.2
make 8.2
electric guitar 8.1
musical instrument 8.1
posing 8
supporter 7.9
guard 7.8
play 7.8
emotional 7.8
scary 7.7
pretty 7.7
hot 7.5
religious 7.5
holding 7.4
light 7.3
playing 7.3
paint 7.2
game 7.1
icon 7.1
player 7
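
The Imagga list matches the response format of Imagga's v2 tagging endpoint. A minimal sketch, assuming placeholder API credentials and filename:

```python
import requests

IMAGGA_KEY = "api_key"        # placeholder credential
IMAGGA_SECRET = "api_secret"  # placeholder credential

# Placeholder filename; the endpoint accepts a multipart image upload.
with open("image.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

# Each result carries an English tag name and a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```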

Google
created on 2022-02-26

(no tags returned)

Microsoft
created on 2022-02-26

text 99.8
indoor 85.3
clothing 80.4
person 72.3
dance 66.6
fashion accessory 65.1
posing 49.6
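
The Microsoft tags have the shape of Azure Computer Vision's image-analysis response with the Tags feature; the API reports confidences in 0–1, scaled to percentages here. A minimal sketch, assuming a placeholder Azure endpoint and key:

```python
import requests

ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # placeholder
KEY = "subscription_key"                                     # placeholder

# Placeholder filename; the image is sent as a raw octet stream.
with open("image.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Confidences come back in 0-1; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```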

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 99.9%
Happy 81.5%
Calm 7.6%
Sad 4.3%
Surprised 1.6%
Angry 1.3%
Disgusted 1.3%
Confused 1.3%
Fear 1.2%

AWS Rekognition

Age 21-29
Gender Female, 100%
Calm 78.1%
Happy 11.4%
Sad 6.2%
Surprised 1.4%
Angry 1%
Disgusted 0.8%
Fear 0.6%
Confused 0.5%

AWS Rekognition

Age 26-36
Gender Female, 64.6%
Calm 76.6%
Sad 12.1%
Confused 3.9%
Happy 2.1%
Surprised 1.7%
Disgusted 1.5%
Fear 1.1%
Angry 1%
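
The three AWS Rekognition blocks above correspond to three detected faces; their shape matches the DetectFaces output with full attributes. A minimal sketch, assuming locally configured AWS credentials and a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder filename; Attributes=["ALL"] requests age range,
# gender, and emotion estimates for every detected face.
with open("image.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # List emotions highest-confidence first, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```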

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 29
Gender Female
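
The single-number age estimates above match the Azure Face API detect operation with age and gender attributes, which Microsoft still returned when these records were created (2022-02-26) but has since restricted. A minimal sketch, assuming a placeholder endpoint and key:

```python
import requests

ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # placeholder
KEY = "subscription_key"                                     # placeholder

# Placeholder filename; the image is sent as a raw octet stream.
with open("image.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# One entry per detected face, like the three blocks above.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```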

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
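
The five-step likelihood ratings above (Very unlikely through Very likely) are the enum values Google Cloud Vision returns from face detection. A minimal sketch, assuming Google Cloud credentials and a placeholder filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder filename.
with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each annotation carries likelihood enums such as VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```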

Feature analysis

Amazon

Person 97%
Poster 92.9%
Helmet 70%

Captions

Microsoft

Tura Satana et al. posing for the camera 66.7%
Tura Satana et al. posing for a picture 66.6%
Tura Satana et al. posing for a photo 57.2%
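
Ranked captions with confidence scores like those above match Azure Computer Vision's describe operation. A minimal sketch, assuming a placeholder endpoint and key:

```python
import requests

ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # placeholder
KEY = "subscription_key"                                     # placeholder

# Placeholder filename; maxCandidates controls how many captions return.
with open("image.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": "3"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Captions arrive with 0-1 confidences; scale to percentages as above.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```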

Text analysis

Amazon

3
BMY
CC

Google

BMY
BMY
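
The text fragments above match the shape of AWS Rekognition's DetectText output (Google's equivalent is Cloud Vision text detection). A minimal sketch of the Amazon side, assuming locally configured AWS credentials and a placeholder filename:

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder filename.
with open("image.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections are whole lines; WORD detections are their parts.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```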