Human Generated Data

Title

Chinese Women: Experimental Color

Date

c. 1941

People

Artist: Edward Steichen, American, 1879–1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.80.8

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Imagga
created on 2022-02-26

guitar 38.4
electric guitar 34.5
person 32.2
black 26
people 24.5
sexy 22.5
musician 21.9
stringed instrument 21.3
outfit 20.6
musical instrument 20
fashion 18.8
adult 18.8
singer 18.8
sax 17.1
style 17
performer 16.8
music 16.6
male 16.3
man 16.1
silhouette 15.7
attractive 14.7
art 14.6
studio 14.4
portrait 14.2
model 14
warrior 13.7
weapon 13.6
dark 13.4
comic book 13.3
lady 13
face 12.8
instrument 12.2
entertainment 12
war 11.5
fun 11.2
body 11.2
hair 11.1
bass 11
soldier 10.7
musical 10.5
pretty 10.5
play 10.3
wind instrument 10.1
uniform 9.9
horror 9.7
microphone 9.7
rock 9.6
party 9.5
costume 9.4
culture 9.4
youth 9.4
power 9.2
dress 9
human 9
religion 9
night 8.9
entertainer 8.9
brass 8.8
gun 8.8
army 8.8
scary 8.7
women 8.7
military 8.7
artist 8.7
design 8.6
star 8.5
action 8.3
traditional 8.3
slim 8.3
painting 8.1
posing 8
evil 7.8
ancient 7.8
hot 7.5
clothes 7.5
holding 7.4
light 7.3
playing 7.3
sensual 7.3
make 7.3
paint 7.2
color 7.2
dance 7.2
game 7.1
conceptual 7
modern 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.8
clothing 89.6
dance 89.1
person 83.8
posing 52.1
costume design 50.7

Face analysis
AWS Rekognition

Age 23-31
Gender Female, 100%
Happy 63.5%
Calm 17.8%
Sad 6.1%
Disgusted 4%
Confused 2.6%
Surprised 2.3%
Angry 1.9%
Fear 1.8%

AWS Rekognition

Age 29-39
Gender Female, 99.6%
Calm 62.5%
Sad 22.5%
Confused 8.2%
Fear 1.8%
Disgusted 1.6%
Happy 1.3%
Surprised 1.2%
Angry 1%

AWS Rekognition

Age 21-29
Gender Female, 100%
Calm 70.5%
Happy 13.3%
Sad 9.5%
Angry 2%
Surprised 1.4%
Disgusted 1.4%
Fear 1.1%
Confused 0.7%

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 35
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.2%
Poster 90.5%

Captions

Microsoft

Tura Satana standing posing for the camera 64.6%

Text analysis

Amazon

B M Y

Google

BMY
BMY