Human Generated Data

Title

Chinese Women: Experimental Color

Date

c. 1941

People

Artist: Edward Steichen, American, 1879-1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.80.7

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Poster 99.3
Advertisement 99.3
Person 96.7
Human 96.7
Person 92.6
Person 91.5
Clothing 88.6
Apparel 88.6
People 73
Sleeve 69.7
Long Sleeve 69.7
Musician 67.1
Musical Instrument 67.1
Leisure Activities 63.3
Helmet 62.9
Coat 59.8
Overcoat 59.8
Figurine 56.5

Imagga
created on 2022-02-26

person 36.2
performer 34
black 23.6
musician 23.4
outfit 22.8
singer 22.4
people 21.2
entertainer 20.1
costume 19.3
fashion 18.8
man 18.8
sexy 17.7
adult 17.1
dancer 16.8
dark 16.7
male 16.3
art 15.2
style 14.8
model 14.8
attractive 14
studio 13.7
guitar 13.7
face 12.8
music 12.8
lady 12.2
comic book 12.1
fun 12
clothing 11.8
portrait 11.6
human 11.2
play 11.2
silhouette 10.8
warrior 10.7
night 10.7
party 10.3
hair 10.3
sensual 10
instrument 9.8
war 9.6
women 9.5
color 9.5
colorful 9.3
entertainment 9.2
holding 9.1
weapon 9.1
dress 9
uniform 8.8
artist 8.7
culture 8.5
expression 8.5
dance 8.5
clothes 8.4
pretty 8.4
slim 8.3
playing 8.2
religion 8.1
light 8
body 8
design 8
celebration 8
player 7.9
happiness 7.8
soldier 7.8
rock 7.8
horror 7.8
star 7.7
performance 7.7
hairstyle 7.6
power 7.6
traditional 7.5
leisure 7.5
one 7.5
automaton 7.4
emotion 7.4
lifestyle 7.2
conceptual 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.8
cartoon 95.1
clothing 93.4
person 88.2
indoor 87.1
electronics 77.1
display 50.3
posing 36.8
computer 33.3
picture frame 6.3

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 87%
Calm 7.8%
Angry 1.4%
Surprised 1.1%
Disgusted 0.9%
Fear 0.6%
Sad 0.6%
Confused 0.5%

AWS Rekognition

Age 20-28
Gender Female, 99.8%
Calm 77.7%
Happy 17%
Sad 2.2%
Surprised 0.9%
Disgusted 0.6%
Angry 0.6%
Confused 0.5%
Fear 0.5%

AWS Rekognition

Age 30-40
Gender Female, 94.9%
Calm 75.4%
Sad 16.8%
Confused 2%
Surprised 1.4%
Happy 1.3%
Fear 1.2%
Disgusted 1.1%
Angry 0.8%

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Poster 99.3%
Person 96.7%
Helmet 62.9%

Captions

Microsoft

a person standing in front of a television 66.5%
a person standing in front of a television 59.8%
a group of people standing in front of a television 59.4%

Text analysis

Amazon

MY
3
В MY
В

Google

N
!!
BMY N !!
BMY