Human Generated Data

Title

Untitled (two male students studying at a table)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12095

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.1
Furniture 92.7
Chair 91.7
Sitting 86.5
Flooring 82.2
Apparel 80.2
Clothing 80.2
Monitor 74.5
LCD Screen 74.5
Electronics 74.5
Screen 74.5
Display 74.5
Wood 73
Plywood 68.7
Table 67.9
Floor 65
Shelf 62.8
Desk 60.3
Indoors 60.1
Computer 59.9
Pc 59.9
Restaurant 58
Living Room 57.5
Room 57.5
Couch 56.9
Face 56.3
Cafeteria 56
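
The Amazon labels above follow the shape of an AWS Rekognition DetectLabels response. Below is a minimal sketch of that call with boto3, assuming AWS credentials are configured and a local copy of the image; the file name "steinmetz_12095.jpg" is a placeholder.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,       # upper bound on returned labels
            MinConfidence=55,   # drop labels scored below ~55%
        )

    # Print "Label confidence" pairs, mirroring the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")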

Imagga
created on 2022-01-15

room 38.9
interior 34.5
table 33.5
home 27.9
desk 26.4
kitchen 25.3
indoors 23.7
office 23.2
modern 23.1
work 22.8
house 21.7
computer 21.5
working 21.2
furniture 21.2
people 20.6
disk jockey 20.3
chair 20
person 18.9
equipment 18.5
hospital 17.9
man 17.5
monitor 16.9
broadcaster 16.2
technology 14.8
design 14.1
inside 13.8
business 13.4
patient 12.9
lifestyle 12.3
communicator 12.2
floor 12.1
light 12
device 12
luxury 12
appliance 12
wood 11.7
window 11.4
adult 11.1
center 11
glass 10.9
stove 10.6
engineer 10.3
professional 10.3
occupation 10.1
indoor 10
male 9.9
job 9.7
medical 9.7
apartment 9.6
smiling 9.4
contemporary 9.4
machine 9.1
domestic 9
home appliance 9
oven 9
network 8.9
style 8.9
worker 8.9
decor 8.8
chairs 8.8
cooking 8.7
television 8.7
senior 8.4
elegance 8.4
health 8.3
decoration 8
shop 7.9
medicine 7.9
lamp 7.9
food 7.9
sitting 7.7
empty 7.7
comfortable 7.6
tile 7.6
horizontal 7.5
doctor 7.5
clean 7.5
data 7.3
clinic 7.3
classroom 7.2
life 7.1
steel 7.1
architecture 7
furnishing 7
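
The Imagga tags above can be reproduced, at least in outline, through Imagga's REST tagging endpoint. A rough sketch with the requests library, assuming an Imagga account; the API key, secret, and file name are placeholders, and the exact upload parameters should be checked against Imagga's current documentation.

    import requests

    IMAGGA_KEY = "your-api-key"        # placeholder credential
    IMAGGA_SECRET = "your-api-secret"  # placeholder credential

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )

    # Each entry carries an English tag name and a 0-100 confidence score.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")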

Google
created on 2022-01-15

Table 94
Black 89.8
Writing desk 84.7
Chair 81.1
Art 78.1
Desk 78
Snapshot 74.3
Vintage clothing 71.5
Monochrome 71.2
Monochrome photography 69.3
Suit 68.7
Sitting 66.8
Curtain 65.9
Room 65.1
Machine 64
Lamp 62.2
Job 62.1
History 59.1
Office equipment 54.3
Font 53
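
The Google labels correspond to Cloud Vision label detection. A minimal sketch with the google-cloud-vision client library, assuming application-default credentials are configured; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are returned in the 0-1 range; scale to match the percentages above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")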

Microsoft
created on 2022-01-15

text 99
man 96.6
person 89.3
indoor 89.1
black and white 81.3
clothing 72
black 67.2
computer 62.8
table 59.5
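
The Microsoft tags resemble output from the Azure Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, assuming an Azure Computer Vision resource; the endpoint, key, and file name are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "your-subscription-key"                                      # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        result = client.tag_image_in_stream(f)

    # Confidence is reported in the 0-1 range; scale to match the list above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")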

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 78.1%
Happy 68.3%
Sad 26.1%
Calm 1.7%
Surprised 1.4%
Angry 0.9%
Confused 0.8%
Disgusted 0.5%
Fear 0.3%
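
The age, gender, and emotion estimates above match the shape of an AWS Rekognition DetectFaces response with all attributes requested. A minimal sketch, reusing the same boto3 client and placeholder file name as the label example.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions arrive unsorted; order by confidence as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")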

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
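
The two Google Vision blocks above are per-face likelihood ratings from Cloud Vision face detection, one block per detected face. A minimal sketch, reusing the same client and placeholder file name as the label-detection example.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            # Likelihood is an enum such as VERY_UNLIKELY; print it as prose.
            print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())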

Feature analysis

Amazon

Person 99.6%
Chair 91.7%

Captions

Microsoft

a man sitting in front of a window 55.5%
a man standing in front of a window 55.4%
a man that is sitting in front of a window 44.4%
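
The ranked captions above resemble output from the Azure Computer Vision describe operation. A minimal sketch, reusing the same Azure client setup and placeholder file name as the tagging example; max_candidates asks for several alternative captions.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("your-subscription-key"),   # placeholder key
    )

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        result = client.describe_image_in_stream(f, max_candidates=3)

    # Each caption carries its own 0-1 confidence; scale to match the list above.
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")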

Text analysis

Amazon

NAVY
DARTMOUTH
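
The Amazon text readings above correspond to AWS Rekognition DetectText. A minimal sketch, reusing the same boto3 client and placeholder file name; filtering on LINE detections reproduces line-level readings such as "NAVY" and "DARTMOUTH".

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])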

Google

CUARTHOUTH NAVY
CUARTHOUTH
NAVY
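
The Google readings above, including the garbled "CUARTHOUTH", are raw OCR output of the kind returned by Cloud Vision text detection. A minimal sketch, reusing the same client and placeholder file name; the first annotation is the full detected text and the rest are individual segments.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_12095.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    for annotation in response.text_annotations:
        print(annotation.description)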