Human Generated Data

Title

Untitled (three men and a woman eating in a restaurant booth)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4556

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.6
Person 99.6
Person 99.1
Person 98.9
Person 90.3
Helmet 87.1
Clothing 87.1
Apparel 87.1
Person 87
Person 85.5
Text 66.4
Finger 65.1
Senior Citizen 56.3
Sleeve 55.9

Imagga
created on 2022-02-05

man 41.7
office 36.9
male 35.6
people 35.2
person 34
barbershop 33.4
professional 31.8
businessman 31.8
business 31
adult 29.1
shop 28.1
meeting 27.3
table 27.1
desk 26.7
room 26.3
teacher 26.2
sitting 24.9
work 23.6
group 23.4
smiling 22.4
team 22.4
mercantile establishment 20.8
happy 20.7
computer 20.1
men 19.8
indoors 19.3
businesswoman 19.1
executive 18.9
educator 18.6
women 18.2
corporate 18.1
working 17.7
talking 17.1
suit 17.1
occupation 16.5
smile 16.4
home 16
lifestyle 15.9
classroom 15.9
indoor 15.5
businesspeople 15.2
job 15
couple 14.8
laptop 14.8
education 14.7
conference 14.7
colleagues 14.6
restaurant 14.3
interior 14.2
presentation 14
teamwork 13.9
place of business 13.9
worker 13.7
life 13.6
communication 13.4
together 13.1
looking 12.8
hairdresser 12.4
cheerful 12.2
senior 12.2
mature 12.1
patient 11.7
manager 11.2
phone 11.1
two 11
portrait 11
modern 10.5
screen 10.3
coffee 10.2
happiness 10.2
chair 10.1
technology 9.7
color 9.5
casual 9.3
hand 9.1
confident 9.1
board 9.1
handsome 8.9
success 8.9
40s 8.8
discussion 8.8
paper 8.6
workplace 8.6
tie 8.5
face 8.5
black 8.4
20s 8.3
clinic 8.2
student 8.1
coworkers 7.9
attractive 7.7
expression 7.7
gesture 7.6
adults 7.6
horizontal 7.5
showing 7.5
holding 7.4
camera 7.4
new 7.3
family 7.1
to 7.1
glass 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.9
clothing 96
person 88.6
man 87.5
human face 63.7
black and white 57.8

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 89.4%
Calm 49.1%
Surprised 26.6%
Sad 9%
Confused 6.7%
Fear 3.8%
Disgusted 2.1%
Angry 1.5%
Happy 1.2%

AWS Rekognition

Age 16-24
Gender Male, 99.8%
Sad 81.5%
Calm 8.6%
Confused 5.8%
Disgusted 1.5%
Angry 1.1%
Surprised 1%
Happy 0.3%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Helmet 87.1%

Captions

Microsoft

a man standing in front of a window 68.8%
a group of people standing in front of a window 66.3%
a man standing next to a window 63.3%

Text analysis

Amazon

8
162
16282
MJI7 8 162
16282.
DKI
ГОИСН
J
MJI7
НАИ J
INC
НАИ

Google

16282.
76282. 16282. MJI7 3TARTIVA TOe
MJI7
TOe
76282.
3TARTIVA