Human Generated Data

Title

Untitled (wedding guests standing on staircase)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10676

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Banister 100
Handrail 100
Railing 99.9
Person 99.2
Human 99.2
Person 98.3
Person 97.6
Clothing 94.4
Apparel 94.4
Person 86.3
Coat 58.6
Overcoat 55.1
Staircase 50.3

Imagga
created on 2022-01-15

picket fence 100
fence 100
barrier 83.9
obstruction 55.7
structure 29.4
man 20.8
barbershop 20
people 19
architecture 18.7
old 18.1
building 16.8
shop 15.5
men 15.4
house 15
male 14.9
person 13.6
home 13.6
business 13.4
city 13.3
adult 12.9
travel 12.7
work 12.6
mercantile establishment 12.3
medical 11.5
groom 11.3
history 10.7
laboratory 10.6
couple 10.4
portrait 10.3
wall 10.3
black 10.2
coat 10.1
happy 10
lab 9.7
historic 9.2
scientist 8.8
looking 8.8
urban 8.7
love 8.7
happiness 8.6
serious 8.6
face 8.5
two 8.5
doctor 8.5
senior 8.4
modern 8.4
place of business 8.2
student 8.1
team 8.1
lifestyle 7.9
women 7.9
smile 7.8
color 7.8
column 7.7
chemistry 7.7
hospital 7.7
sky 7.6
research 7.6
biology 7.6
tourism 7.4
town 7.4
nurse 7.4
vacation 7.4
success 7.2
summer 7.1
businessman 7.1
medicine 7
indoors 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 97.1
black and white 93.3
clothing 92.2
text 79.8
man 78.6
white 71.6
baby bed 53.2
monochrome 52.3

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 98.8%
Calm 99.1%
Sad 0.5%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 98.1%
Sad 75.6%
Calm 8.4%
Happy 7.8%
Surprised 4.2%
Fear 1.5%
Confused 0.9%
Angry 0.8%
Disgusted 0.7%

AWS Rekognition

Age 52-60
Gender Male, 99.8%
Calm 99.8%
Sad 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Happy 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Staircase 50.3%

Captions

Microsoft

a group of people standing in a room 95.8%
a group of people standing in front of a window 91%
a group of people in a room 90.9%

Text analysis

Amazon

21
21303.
21 303,
303,
SI

Google

21303.