Human Generated Data

Title

Untitled (two sailors and two women sitting at restaurant table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4998

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.8
Human 99.8
Restaurant 99.8
Person 99.6
Person 99.5
Person 99.3
Person 98.7
Cafe 95
Cafeteria 86.6
Meal 85
Food 85
Sitting 77.1
Food Court 76.5
Bar Counter 62.3
Pub 62.3
Crowd 59.1
Shop 55.8

Imagga
created on 2022-01-22

businessman 34.4
business 34
male 32.6
man 28.9
office 24.8
people 24.5
person 22.5
stall 20.2
businesswoman 20
meeting 19.8
work 19.6
newspaper 19.6
adult 19.2
job 18.6
group 18.5
laptop 18.2
corporate 18
businesspeople 18
room 17.1
computer 16.8
men 16.3
happy 16.3
colleagues 15.5
team 15.2
desk 14.9
manager 14.9
teamwork 14.8
casual 14.4
looking 14.4
product 14.3
building 13.8
smiling 13.7
success 13.7
classroom 13.6
indoors 13.2
table 12.7
finance 12.7
technology 12.6
executive 12.3
together 12.3
successful 11.9
old 11.8
lifestyle 11.5
working 11.5
plan 11.3
new 11.3
professional 11.2
senior 11.2
modern 11.2
creation 11.1
design 10.7
engineer 10.6
hand 10.6
busy 10.6
daily 10.6
color 10.6
company 10.2
day 10.2
horizontal 10
indoor 10
clothing 9.8
associates 9.8
coworkers 9.8
discussion 9.7
copy 9.7
portrait 9.7
architect 9.6
chart 9.5
females 9.4
paper 9.4
happiness 9.4
worker 9.2
camera 9.2
confident 9.1
suit 9
cheerful 8.9
conference 8.8
40s 8.7
couple 8.7
women 8.7
cooperation 8.7
mid adult 8.7
30s 8.6
architecture 8.6
customer 8.6
face 8.5
money 8.5
writing 8.5
communication 8.4
brass 8.1
wind instrument 7.9
client 7.8
designer 7.7
sitting 7.7
construction 7.7
project 7.7
architectural 7.7
center 7.6
workplace 7.6
engineering 7.6
talking 7.6
adults 7.6
speedway 7.5
holding 7.4
20s 7.3
hall 7.1

Google
created on 2022-01-22

Photograph 94.2
Black 89.8
Table 88.2
Black-and-white 87.9
Style 84.1
Line 81.7
Monochrome 80.2
Suit 79.5
Font 79.3
Monochrome photography 79.3
Eyewear 76.3
Snapshot 74.3
Event 74
Art 73.5
Automotive design 72.3
Customer 71.7
T-shirt 71.2
Publication 70
Job 66
Room 65.8

Microsoft
created on 2022-01-22

text 99.5
person 93
man 93
clothing 90.5
human face 68.3
menu 54

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.5%
Surprised 32.3%
Calm 28.2%
Happy 20%
Fear 6.9%
Sad 5.2%
Confused 4.1%
Disgusted 2.2%
Angry 1.2%

AWS Rekognition

Age 29-39
Gender Female, 91.4%
Calm 95.5%
Sad 3.7%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Surprised 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 71.1%
Calm 99.9%
Sad 0%
Happy 0%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 42-50
Gender Female, 84.5%
Calm 49.6%
Happy 21.6%
Surprised 9.3%
Sad 5.9%
Disgusted 5.2%
Confused 4%
Angry 3.2%
Fear 1.2%

AWS Rekognition

Age 20-28
Gender Male, 68.4%
Calm 89.8%
Happy 4.4%
Disgusted 2.1%
Sad 2%
Confused 0.6%
Angry 0.5%
Fear 0.3%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people sitting in front of a building 74.1%
a group of people standing in front of a building 74%
a group of people sitting at a table 71.9%

Text analysis

Amazon

REAL
16436.
REAL PHOTOGRAPHS
PHOTOGRAPHS

Google

16
43
16436. REAL PHOTOCAA nsENTO 16 43 6.
REAL
6.
nsENTO
16436.
PHOTOCAA