Human Generated Data

Title

Untitled (two couples drinking at a table)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4828

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 98.3
Apparel 98.3
Chair 97.9
Furniture 97.9
Person 97.6
Human 97.6
Person 97
Restaurant 96.4
Helmet 96
Accessories 94.2
Accessory 94.2
Tie 94.2
Person 93.3
Sitting 83.3
Cafeteria 82.3
Table 79.6
Hat 76
Food 73.7
Meal 73.7
People 69.8
Female 66.4
Dining Table 61.8
Cafe 61
Sun Hat 58.3
Food Court 57.9
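The Amazon tags above match the shape of a Rekognition DetectLabels response (a label name plus a 0-100 confidence score). A minimal sketch of how such a listing could be produced; the filename is hypothetical, and the sample response below simply reuses values from the listing:

```python
def format_labels(response, min_confidence=55.0):
    """Flatten a DetectLabels-style response into 'Name score' lines."""
    return [
        f"{label['Name']} {label['Confidence']:.1f}"
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

# A real call would require AWS credentials, e.g.:
#   import boto3
#   client = boto3.client("rekognition")
#   with open("steinmetz_4828.jpg", "rb") as f:   # hypothetical filename
#       response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=50)

# Sample response shaped like Rekognition output, with values from the listing:
sample = {"Labels": [{"Name": "Clothing", "Confidence": 98.3},
                     {"Name": "Chair", "Confidence": 97.9},
                     {"Name": "Sun Hat", "Confidence": 58.3}]}
print(format_labels(sample))
```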

Imagga
created on 2022-01-29

room 35.7
interior 32.7
table 31.7
salon 29
restaurant 25
people 21.7
chair 19.8
dinner 19.6
indoors 18.4
hospital 17.9
medical 16.8
person 16.5
health 16
man 15.5
decoration 15.2
dining 15.2
home 15.1
nurse 15.1
shop 14.5
furniture 14.5
glass 14.4
drink 14.2
patient 14
service 13.9
clinic 13.8
luxury 13.7
elegant 13.7
setting 13.5
decor 13.3
medicine 13.2
party 12.9
banquet 12.6
doctor 12.2
lunch 12.1
work 12
wedding 12
reception 11.8
equipment 11.6
instrument 11.5
modern 11.2
professional 11.1
indoor 11
napkin 10.7
design 10.7
food 10.5
place 10.2
inside 10.1
male 9.9
catering 9.8
dine 9.8
working 9.7
laboratory 9.6
style 9.6
knife 9.6
fork 9.6
celebration 9.6
mercantile establishment 9.4
lifestyle 9.4
business 9.1
office 9.1
team 9
family 8.9
adult 8.9
barbershop 8.9
silverware 8.8
assistant 8.7
chemistry 8.7
fancy 8.7
iron lung 8.7
sitting 8.6
men 8.6
research 8.6
meal 8.4
wine 8.3
human 8.2
clothing 8
silver 8
life 7.9
worker 7.8
scientist 7.8
seat 7.8
lab 7.8
chemical 7.7
plate 7.7
test 7.7
formal 7.6
fine 7.6
biology 7.6
elegance 7.6
eat 7.5
contemporary 7.5
house 7.5
event 7.4
device 7.3
women 7.1
science 7.1
coat 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 93.8
table 87.8
clothing 87.1
furniture 83.9
person 76.1
woman 71.4
black and white 70.4
chair 62.6
old 51.4
dining table 6.6

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Male, 93%
Happy 93.6%
Calm 4.5%
Sad 0.6%
Disgusted 0.3%
Angry 0.3%
Confused 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 50-58
Gender Female, 54.1%
Calm 91.3%
Sad 6.5%
Angry 0.7%
Fear 0.6%
Happy 0.3%
Confused 0.3%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 48-56
Gender Female, 53%
Calm 87.6%
Sad 7.8%
Surprised 2.8%
Confused 0.6%
Happy 0.5%
Angry 0.4%
Disgusted 0.3%
Fear 0.1%
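Each AWS Rekognition face block above corresponds to one FaceDetail in a DetectFaces response: an age range, a gender estimate, and per-emotion confidences. A sketch of how one block could be summarized; the sample dictionary reuses values from the first face above:

```python
def summarize_face(detail):
    """Summarize a DetectFaces-style FaceDetail: age range, gender, top emotion."""
    age = detail["AgeRange"]
    gender = detail["Gender"]
    # Rekognition returns emotions unsorted; pick the most confident one.
    top = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return (f"Age {age['Low']}-{age['High']}, "
            f"{gender['Value']} {gender['Confidence']:.1f}%, "
            f"{top['Type'].capitalize()} {top['Confidence']:.1f}%")

# Values taken from the first face block in the listing:
face = {"AgeRange": {"Low": 47, "High": 53},
        "Gender": {"Value": "Male", "Confidence": 93.0},
        "Emotions": [{"Type": "HAPPY", "Confidence": 93.6},
                     {"Type": "CALM", "Confidence": 4.5}]}
print(summarize_face(face))
```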

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
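Unlike Rekognition, Google Cloud Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch of rendering those buckets as in the blocks above; the integer codes mirror the API's likelihood enum, and the sample face reuses the second block's values:

```python
# A real call would require google-cloud-vision and credentials, e.g.:
#   from google.cloud import vision
#   client = vision.ImageAnnotatorClient()
#   faces = client.face_detection(
#       image=vision.Image(content=image_bytes)).face_annotations

# Likelihood enum values 0-5, in the order the API defines them:
LIKELIHOODS = ["Unknown", "Very unlikely", "Unlikely",
               "Possible", "Likely", "Very likely"]

def describe_face(face):
    """Render a face annotation's likelihood fields as in the listing above."""
    return {attr: LIKELIHOODS[face[attr]]
            for attr in ("joy", "sorrow", "anger",
                         "surprise", "headwear", "blurred")}

# Codes matching the second Google Vision block above:
face = {"joy": 1, "sorrow": 1, "anger": 1,
        "surprise": 1, "headwear": 4, "blurred": 1}
print(describe_face(face))
```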

Feature analysis

Amazon

Chair 97.9%
Person 97.6%
Helmet 96%
Tie 94.2%
Hat 76%

Captions

Microsoft

a group of people sitting at a table 84.1%
a group of people sitting around a table 84%
a group of people sitting on a table 75.8%
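The Microsoft captions above look like output from Azure Computer Vision's image-description feature, which scores each candidate caption with a confidence in [0, 1]; the listing shows those scores as percentages. A sketch of that conversion, with the sample result reusing two captions from above:

```python
def format_captions(describe_result):
    """Convert caption confidences in [0, 1] to 'text pct%' lines."""
    return [f"{c['text']} {c['confidence'] * 100:.1f}%"
            for c in describe_result["captions"]]

# Sample shaped like an Azure describe result, using values from the listing:
sample = {"captions": [
    {"text": "a group of people sitting at a table", "confidence": 0.841},
    {"text": "a group of people sitting around a table", "confidence": 0.840},
]}
print(format_captions(sample))
```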