Human Generated Data

Title

Untitled (man and woman in front of Heinz display)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4464

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label followed by confidence score, %)

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 98.5
Person 91.4
Clinic 78.8
Clothing 74.5
Apparel 74.5
Person 71.4
Person 69.8
Doctor 59.6
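
Label/confidence pairs of this shape are what AWS Rekognition's DetectLabels operation returns. A minimal sketch of such a call with boto3, assuming a hypothetical S3 location for the image (bucket and file names are placeholders, not the museum's actual storage):

```python
import boto3

# Placeholder region and image location.
client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4464.jpg"}},
    MaxLabels=20,
    MinConfidence=50.0,  # drop low-confidence labels
)

# Each label carries a name and a confidence in percent,
# matching the "Person 99.7" style entries above. Repeated
# "Person" rows correspond to separate detected instances.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```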

Imagga
created on 2022-01-23

businessman 36.2
business 35.2
man 30.9
male 27.7
person 26.9
corporate 24
people 23.4
team 20.6
work 20.4
teacher 20.2
professional 19.6
adult 18.8
office 18.6
job 18.6
success 18.5
education 18.2
blackboard 17.4
executive 17.3
men 17.2
group 16.9
looking 16.8
laptop 15.7
computer 15.3
school 15.3
communication 15.1
teamwork 14.8
board 14.6
human 14.2
meeting 14.1
technology 14.1
classroom 14.1
businesswoman 13.6
class 13.5
employee 13.4
chart 13.4
happy 13.2
computerized axial tomography scanner 12.4
working 12.4
smiling 12.3
drawing 12.1
black 12
hand 11.9
modern 11.9
finance 11.8
entrepreneur 11.6
financial 11.6
world 11.2
suit 10.8
businessmen 10.7
sitting 10.3
room 10.3
manager 10.2
presentation 10.2
student 10.2
successful 10.1
diagram 9.7
boss 9.6
businesspeople 9.5
company 9.3
occupation 9.2
map 8.9
educator 8.9
x-ray machine 8.8
graph 8.6
portrait 8.4
holding 8.2
confident 8.2
to 8
women 7.9
brainstorming 7.9
design 7.9
hands 7.8
teaching 7.8
chalk 7.8
colleagues 7.8
desk 7.8
casual 7.6
two 7.6
united 7.6
talking 7.6
economy 7.4
globe 7.4
global 7.3
worker 7.1
smile 7.1
idea 7.1
table 7.1
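
The Imagga list has the same label-plus-confidence shape, served over REST. A sketch, assuming Imagga's documented /v2/tags endpoint with HTTP Basic auth; the credentials and image URL are placeholders:

```python
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/steinmetz-4464.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags arrive with a 0-100 confidence and a language-keyed label,
# e.g. "businessman 36.2" as in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```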

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 96.9
man 94.1
window 92.8
person 92.3
clothing 90.5
cartoon 83
music 82.2
banjo 27
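
The Microsoft tags follow the shape of Azure Computer Vision's Analyze Image response. A sketch, assuming the v3.2 REST endpoint (the resource endpoint, key, and image URL are placeholders); note the raw API reports confidence in the 0–1 range, so figures like 96.9 above are that value scaled to percent:

```python
import requests

ENDPOINT = "https://example-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz-4464.jpg"},  # placeholder
)
resp.raise_for_status()

# Scale the 0-1 confidence to percent for display.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```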

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 98%
Calm 78.2%
Happy 20.8%
Sad 0.2%
Surprised 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%
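
The age range, gender estimate, and emotion rows map onto fields of Rekognition's DetectFaces response when full attributes are requested. A sketch with a placeholder image location:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4464.jpg"}},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    # Emotions are scored independently; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```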

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
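
The three blocks above are per-face annotations from the Cloud Vision API, which reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a numeric score. A sketch using the google-cloud-vision client, with a placeholder image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "gs://example-bucket/steinmetz-4464.jpg"  # placeholder

response = client.face_detection(image=image)

# One annotation per detected face, each with likelihood buckets
# for the attributes shown above.
for i, face in enumerate(response.face_annotations, start=1):
    print(f"Face {i}")
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```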

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a window 56.8%
a man standing in front of a store window 40.3%
a man that is standing in front of a window 40.2%
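
Ranked captions like these match Azure Computer Vision's Describe Image output, which returns up to maxCandidates caption candidates with 0–1 confidences. A sketch with placeholder endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://example-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # ask for several ranked captions
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz-4464.jpg"},  # placeholder
)
resp.raise_for_status()

# Candidates arrive ranked by confidence, as in the list above.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```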

Text analysis

Amazon

NZ
17296.
KOBVY
2VLE1X
DO
SMART
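
Short fragments like these come from Rekognition's DetectText operation, which returns both word- and line-level detections; OCR on signage in a 1941 photograph is understandably noisy. A sketch with a placeholder image location:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-4464.jpg"}}
)

# WORD detections give the short fragments listed above;
# LINE detections group them into longer strings.
for det in response["TextDetections"]:
    if det["Type"] == "WORD":
        print(det["DetectedText"])
```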

Google

7296
17296
NZ
17296 TAND NZ 7296
TAND
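
The Google results have the shape of Cloud Vision text detection, where the first annotation is the full recovered block (here "17296 TAND NZ 7296") and the rest are its individual fragments. A sketch with a placeholder image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "gs://example-bucket/steinmetz-4464.jpg"  # placeholder

response = client.text_detection(image=image)

annotations = response.text_annotations
if annotations:
    # annotations[0] is the full detected text; the rest are fragments.
    print("Full text:", annotations[0].description)
    for fragment in annotations[1:]:
        print(fragment.description)
```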