Human Generated Data

Title

Untitled (woman displaying Tupperware)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8900

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 98.9
Human 98.9
Clothing 89.6
Apparel 89.6
Face 86.4
Furniture 81
Female 80.4
Chair 77.4
Table 73.5
Photo 70.2
Photography 70.2
Portrait 70.2
Girl 68.6
Sleeve 67.4
Pottery 59.4
Coffee Cup 58.8
Cup 58.8
Indoors 57.8
Dining Table 55.7
Art 55.6
Drawing 55.6
Jar 55.2

Imagga
created on 2022-01-15

laptop 31.3
people 29.6
person 29.5
business 29.1
office 26.5
computer 24.9
male 24.9
man 24.4
adult 24.2
home 24.1
indoors 23.7
smiling 22.4
professional 21.6
happy 20
businesswoman 20
work 19.8
businessman 19.4
working 18.6
worker 18.1
sitting 18
job 17.7
holding 17.3
corporate 17.2
room 15.6
portrait 15.5
technology 14.8
modern 14
desk 14
table 13.7
executive 13.1
cheerful 13
men 12.9
lifestyle 12.3
group 12.1
indoor 11.9
confident 11.8
smile 11.4
face 11.4
meeting 11.3
cup 11.2
manager 11.2
successful 11
house 10.9
newspaper 10.5
attractive 10.5
success 10.5
talking 10.5
businesspeople 10.4
mobile 10.4
paper 10.3
women 10.3
communication 10.1
conference 9.8
color 9.5
happiness 9.4
mature 9.3
teamwork 9.3
horizontal 9.2
suit 9.1
one 9
looking 8.8
together 8.8
education 8.7
wireless 8.6
domestic 8.6
reading 8.6
thinking 8.5
negative 8.5
keyboard 8.4
container 8.3
20s 8.2
alone 8.2
new 8.1
clothing 8.1
team 8.1
light 8
interior 8
vertical 7.9
day 7.8
using 7.7
30s 7.7
casual 7.6
glasses 7.4
phone 7.4
occupation 7.3
furniture 7.2
creation 7.1
handsome 7.1
product 7.1

Microsoft
created on 2022-01-15

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Female, 98.1%
Calm 96.8%
Sad 1.5%
Happy 1.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
Angry 0.1%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a person standing next to a cup of coffee 31%
a person standing in front of a microwave 30.9%
a person holding a cup 30.8%

Text analysis

Amazon

SI91h

Google

91916
91916