Human Generated Data

Title

Untitled (men washing in front of mirror, Harvard Hasty Pudding Club on train to Philadelphia)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4630

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.5
Person 99.5
Person 98.7
Person 98
Clothing 97.2
Apparel 97.2
Person 90.4
Advertisement 88.7
Poster 88.7
Text 83.3
Face 76.4
Female 68.2
Skin 60.4
People 60.2
Sleeve 56
Hat 55.9

Imagga
created on 2021-12-14

people 34
person 33.3
man 28.9
work 26.7
businessman 25.6
adult 25.5
business 25.5
male 24.1
newspaper 24.1
professional 22.5
worker 21.4
product 20.3
office 19.3
working 18.6
portrait 17.5
creation 17.4
businesspeople 17.1
clothing 16.8
corporate 16.3
job 15.9
medical 15.9
businesswoman 15.4
hand 15.2
happy 15
coat 14.5
looking 14.4
doctor 14.1
medicine 14.1
human 13.5
team 13.4
negative 13.3
indoors 13.2
smiling 13
group 12.9
occupation 12.8
computer 12.8
student 12.7
technology 12.6
attractive 12.6
laboratory 12.5
science 12.4
film 12.1
casual 11.9
health 11.8
clinic 11.7
lifestyle 11.6
face 11.4
education 11.2
nurse 11
successful 11
room 10.8
smile 10.7
colleagues 10.7
career 10.4
manager 10.2
day 10.2
teamwork 10.2
executive 10.1
20s 10.1
indoor 10
equipment 9.9
care 9.9
cheerful 9.7
lab 9.7
hospital 9.7
men 9.4
photographic paper 9.3
jacket 9.2
pretty 9.1
case 9
associates 8.8
home 8.8
scientific 8.7
patient 8.7
bright 8.6
research 8.6
biology 8.5
meeting 8.5
lab coat 8.2
holding 8.2
scientist 7.8
desk 7.8
chemistry 7.7
white goods 7.7
test 7.7
suit 7.7
profession 7.7
formal 7.6
one 7.5
alone 7.3
laptop 7.3
confident 7.3
success 7.2
color 7.2
women 7.1
table 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.8
drawing 98.7
person 97.6
sketch 97.3
black and white 86.2
cartoon 70.6
clothing 68.8
man 52.6
preparing 45.6

Face analysis

Amazon

Google

AWS Rekognition

Age 39-57
Gender Male, 95.2%
Calm 80.5%
Sad 17.3%
Confused 0.9%
Surprised 0.6%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 46-64
Gender Male, 52.5%
Calm 68.3%
Happy 19.7%
Sad 11%
Confused 0.4%
Angry 0.3%
Surprised 0.2%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 27-43
Gender Male, 91.3%
Calm 73.4%
Surprised 14.5%
Happy 8.2%
Sad 1.6%
Confused 1.2%
Angry 0.4%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 14-26
Gender Female, 65.7%
Sad 47.5%
Fear 20.4%
Happy 8.5%
Calm 7.2%
Angry 6.7%
Confused 4.4%
Surprised 2.9%
Disgusted 2.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Poster 88.7%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 43.5%
a group of people standing in front of a mirror posing for the camera 26.1%
a person standing in front of a mirror 26%

Text analysis

Amazon

OT
.18 OT NIAST
38A8
NIAST
.18
TT37A2 VY2ح 38A8
TT37A2
VY2ح