Human Generated Data

Title

The Clinic Stairs (publicity photograph for New York Eye, Ear and Throat Hospital) [for Vanity Fair, December 1931]

Date

1931

People

Artist: Edward Steichen, American 1879 - 1973

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Edward Steichen by direction of Joanna T. Steichen and the George Eastman House, P1982.66

Copyright

© The Estate of Edward Steichen / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Indoors 100
Interior Design 100
Room 99.3
Person 99
Human 99
Audience 98.8
Crowd 98.8
Theater 97.8
Person 97.6
Person 97.2
Person 96.2
Person 92.2
Orchestra Pit 87.7
Leisure Activities 87.7
Person 86.9
Person 84
Person 81.1
Person 78.1
Cinema 78
Person 77.7
Person 77.5
Person 72.9
Person 70.8
Person 70.8
Auditorium 69.7
Hall 69.7
Person 69.4
Person 66.1
Person 64
People 61.7
Hardhat 60.6
Clothing 60.6
Helmet 60.6
Apparel 60.6
Person 51.8
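
The label/confidence pairs above match the shape of output from the AWS Rekognition DetectLabels API. The museums' actual tagging pipeline is not documented here; the following is only a minimal sketch of how comparable labels could be generated, assuming configured AWS credentials and a hypothetical local copy of the photograph named clinic_stairs.jpg.

# Minimal sketch (assumption, not the documented pipeline): generate
# label/confidence pairs like the Amazon list above with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# "clinic_stairs.jpg" is a hypothetical local copy of the photograph.
with open("clinic_stairs.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=50.0,  # drop labels scored below 50%
)

# Each label carries a name and a confidence score, e.g. "Indoors 100".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')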

Clarifai
created on 2023-10-25

people 100
adult 99.2
music 99
group 98.8
many 98.8
man 98.7
musician 97.7
woman 97.6
concert 96.3
group together 96
art 95.2
crowd 94.4
stage 94.3
audience 94
singer 91.8
performance 91
instrument 90.7
one 90.2
leader 89.2
portrait 89.1

Imagga
created on 2021-12-15

black 32.6
musical instrument 20.2
wind instrument 17.2
brass 16.1
night 14.2
lamp 13.4
dark 13.3
art 13.3
silhouette 13.2
person 13.2
spotlight 13.2
cornet 13
stringed instrument 11.5
sax 11.4
device 10.9
fashion 10.5
one 10.4
elegant 10.3
symbol 10.1
water 10
style 9.6
body 9.6
light 9.4
man 9.4
glass 9.3
old 9
source of illumination 9
party 8.6
dance 8.5
action 8.5
adult 8.4
people 7.8
male 7.8
modern 7.7
grunge 7.7
elegance 7.5
music 7.4
gold 7.4
decoration 7.4
sensuality 7.3
metal 7.2
sexy 7.2
celebration 7.2
religion 7.2

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.5
concert 90.3
person 90.2
musical instrument 75.6
music 73.7
black and white 72.8
player 72.7
watching 47.8
image 35.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 3-9
Gender Male, 93.2%
Calm 47.6%
Fear 30.9%
Surprised 10.3%
Sad 4.1%
Happy 3%
Angry 2.4%
Disgusted 1%
Confused 0.7%

AWS Rekognition

Age 27-43
Gender Female, 88%
Calm 97.5%
Sad 0.9%
Surprised 0.6%
Fear 0.3%
Angry 0.2%
Confused 0.2%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 16-28
Gender Female, 96%
Sad 64.6%
Calm 32.2%
Fear 1.2%
Confused 0.9%
Angry 0.6%
Surprised 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 23-35
Gender Female, 77.9%
Calm 67.6%
Fear 21.8%
Happy 4.2%
Surprised 1.9%
Sad 1.6%
Angry 1.1%
Disgusted 1%
Confused 0.8%

AWS Rekognition

Age 19-31
Gender Male, 93%
Calm 73.1%
Sad 23%
Confused 1%
Angry 0.9%
Fear 0.9%
Happy 0.5%
Disgusted 0.4%
Surprised 0.2%

AWS Rekognition

Age 22-34
Gender Male, 83.8%
Sad 75.9%
Fear 10.1%
Calm 8.7%
Angry 3.3%
Surprised 1.2%
Confused 0.5%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 16-28
Gender Female, 57.9%
Calm 97.1%
Sad 1.1%
Angry 0.6%
Surprised 0.4%
Happy 0.3%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 25-39
Gender Female, 59.9%
Calm 71.9%
Sad 26.2%
Fear 1.1%
Angry 0.3%
Confused 0.2%
Surprised 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-37
Gender Male, 70.7%
Sad 58.8%
Calm 39.1%
Happy 1.2%
Confused 0.3%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 13-25
Gender Female, 72%
Calm 81.8%
Sad 11.9%
Fear 2.8%
Angry 1.8%
Surprised 0.7%
Happy 0.5%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 24-38
Gender Female, 90.1%
Calm 67.5%
Sad 30.9%
Fear 0.6%
Confused 0.3%
Happy 0.3%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 13-23
Gender Male, 98%
Calm 43.5%
Surprised 18.4%
Fear 14.4%
Sad 12.5%
Confused 9.8%
Angry 0.9%
Disgusted 0.4%
Happy 0.2%

AWS Rekognition

Age 25-39
Gender Male, 55.5%
Sad 46.3%
Calm 38.4%
Confused 7.5%
Surprised 3%
Angry 1.7%
Disgusted 1.1%
Fear 1.1%
Happy 1%

AWS Rekognition

Age 19-31
Gender Female, 80.1%
Fear 82.5%
Sad 8.2%
Calm 7.2%
Angry 0.7%
Happy 0.6%
Surprised 0.5%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 26-40
Gender Female, 98.8%
Calm 85.1%
Sad 5.2%
Fear 4.7%
Happy 2.2%
Surprised 0.8%
Confused 0.8%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 29-45
Gender Female, 67.8%
Sad 93.2%
Calm 4.1%
Happy 1.6%
Fear 0.6%
Confused 0.3%
Surprised 0.1%
Angry 0.1%
Disgusted 0%

AWS Rekognition

Age 27-43
Gender Female, 88.6%
Calm 54.3%
Angry 29.1%
Disgusted 6.6%
Sad 3.1%
Happy 2.9%
Surprised 1.9%
Fear 1.1%
Confused 1%
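
The age range, gender, and emotion percentages listed above follow the structure of an AWS Rekognition DetectFaces response. As a rough sketch only (the exact pipeline is not documented here), entries in this form could be produced as follows, again assuming a hypothetical local file clinic_stairs.jpg and configured AWS credentials.

# Minimal sketch (assumption): pull age range, gender, and emotion
# scores like the face-analysis entries above from AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("clinic_stairs.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 3, "High": 9}
    gender = face["Gender"]   # value plus confidence
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unordered; sort to match the listings above.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')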

Feature analysis

Amazon

Person 99%

Text analysis

Amazon

925-6

Google

925
925
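
The detected strings above ("925-6", "925") are consistent with an OCR pass over the photograph. As one hedged example of how such a reading could be obtained (not necessarily how these values were produced), AWS Rekognition's DetectText API returns line- and word-level detections with confidence scores; a hypothetical local copy of the image is again assumed.

# Minimal sketch (assumption): OCR the photograph with AWS Rekognition
# DetectText and print line-level readings with their confidences.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("clinic_stairs.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE entries are whole detected lines; WORD entries are their parts,
# which is why the same digits can appear more than once in a listing.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')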