Human Generated Data

Title

Untitled (waitress taking the order of three women in a booth at a diner)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4891

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.5
Person 99.5
Person 99.4
Person 99.4
Person 99.2
Person 99.1
Indoors 92.7
Interior Design 92.7
Crowd 68.7
Art 68.5
People 63.3
Restaurant 57.9
Apparel 56
Clothing 56
Dating 55.1

Imagga
created on 2022-01-23

musical instrument 39.7
wind instrument 34.2
sax 33.3
stage 29.9
man 29.6
person 25
accordion 23.6
male 23.4
people 21.2
adult 19.9
keyboard instrument 19.7
platform 18.7
businessman 18.5
business 18.2
group 16.9
musician 15.2
job 14.2
silhouette 14.1
teacher 13.8
classroom 13
music 12.7
black 12.6
rock 12.2
blackboard 11.9
laptop 11.8
education 11.3
men 11.2
student 10.9
sky 10.8
singer 10.8
team 10.7
concert 10.7
professional 10.6
class 10.6
chair 10.6
computer 10.4
youth 10.2
work 10.2
brass 10.1
school 9.9
modern 9.8
success 9.7
musical 9.6
room 9.2
board 9
oboe 9
performer 8.9
technology 8.9
bass 8.8
working 8.8
device 8.8
symbol 8.8
guitar 8.6
player 8.6
life 8.2
executive 7.9
boy 7.8
students 7.8
portrait 7.8
sitting 7.7
performance 7.7
studio 7.6
sign 7.5
happy 7.5
style 7.4
office 7.2
lifestyle 7.2
copy space 7.2
art 7.2
women 7.1
night 7.1
cornet 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.2
person 93
clothing 89.1
drawing 87.9
old 79.4
black and white 79.3
man 71.2
cartoon 70.2

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 98.5%
Happy 76.3%
Surprised 11.8%
Fear 3.4%
Disgusted 2.8%
Confused 2.5%
Sad 1.1%
Calm 1%
Angry 0.9%

AWS Rekognition

Age 28-38
Gender Male, 97.1%
Surprised 88.3%
Calm 7.6%
Fear 1.7%
Disgusted 0.8%
Angry 0.7%
Confused 0.4%
Happy 0.4%
Sad 0.2%

AWS Rekognition

Age 35-43
Gender Male, 99.7%
Calm 99%
Surprised 0.5%
Sad 0.2%
Happy 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a group of people around each other 79.8%
a vintage photo of a person 79.7%
a vintage photo of a group of people 79.6%

Text analysis

Amazon

H16008.
H16008
form

Google

HI6008. H16008.
HI6008.
H16008.