Human Generated Data

Title

Untitled (men seated at restaurant tables)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4915

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 99
Person 98.4
Person 97.8
Person 97.7
Person 97.6
Person 97.5
Chair 96.1
Furniture 96.1
Restaurant 93.5
Sitting 89.4
Person 85.5
Cafeteria 77.1
Worker 76.5
Pub 74.5
Bar Counter 71.9
Crowd 70.5
People 66
Meal 62.5
Food 62.5
Cafe 59.1
Chair 56.8
Clinic 55.9
Table 55.4
Person 52.6
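
The label/confidence pairs above are the kind of output returned by an automated image-labeling service. As a rough illustration only, here is a minimal sketch of how comparable tags could be requested from Amazon Rekognition's DetectLabels API (the record credits these tags to "Amazon", so Rekognition is assumed); the file name and thresholds are placeholders, not part of the record.

    # Sketch: request image labels from Amazon Rekognition (boto3).
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=30,       # cap on returned labels (placeholder value)
            MinConfidence=50,   # drop low-confidence labels (placeholder value)
        )
    for label in response["Labels"]:
        # Prints pairs such as "Person 99.3", matching the format of the list above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')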

Clarifai
created on 2023-10-27

people 99.9
group 98.5
man 98.5
adult 98.3
woman 98.2
group together 97.9
sit 94
furniture 90.5
chair 89.4
restaurant 88.6
monochrome 87.6
many 86.8
league 86.2
indoors 85.8
room 85.6
education 85.1
child 84.2
recreation 83.4
bar 83.2
administration 81.9

Imagga
created on 2022-01-23

classroom 68.5
room 62.3
male 36.9
person 35.9
man 35.6
people 34.6
group 33
businessman 32.7
business 29.8
meeting 29.2
men 26.6
office 26.5
table 26
teacher 24
adult 22.7
team 21.5
happy 20.7
teamwork 20.4
work 19.6
student 19.2
businesswoman 19.1
corporate 18.9
sitting 18.9
smiling 18.8
brass 18.3
education 18.2
professional 17.7
communication 17.6
women 17.4
executive 16.2
businesspeople 16.1
job 15.9
cornet 15.6
colleagues 15.5
class 15.4
talking 15.2
desk 15.1
together 14.9
wind instrument 14.6
chair 14.4
hall 14.4
indoors 14.1
modern 14
worker 14
successful 13.7
restaurant 13.7
blackboard 13.4
school 13.3
manager 13
cheerful 13
success 12.9
indoor 12.8
conference 12.7
couple 12.2
new 12.1
smile 12.1
board 11.8
teaching 11.7
lifestyle 11.6
interior 11.5
musical instrument 11.4
workplace 11.4
looking 11.2
confident 10.9
discussing 10.8
discussion 10.7
computer 10.4
plan 10.4
laptop 10.1
holding 9.9
suit 9.9
hand 9.9
working 9.7
portrait 9.7
technology 9.6
happiness 9.4
learning 9.4
two 9.3
presentation 9.3
company 9.3
mature 9.3
coworkers 8.8
students 8.8
employee 8.7
adults 8.5
finance 8.4
design 8.4
home 8
associates 7.9
standing 7.8
cooperation 7.7
corporation 7.7
stage 7.7
planning 7.7
studying 7.7
collar 7.7
drinking 7.7
building 7.6
human 7.5
ideas 7.5
study 7.5
planner 7.4
training 7.4
black 7.2
handsome 7.1
idea 7.1

Google
created on 2022-01-23

Furniture 93.1
Chair 90.8
Table 89.8
Black 89.6
Black-and-white 82.7
Snapshot 74.3
Monochrome photography 73.1
Monochrome 72.4
Event 72.2
Suit 70.1
Room 69.5
Vintage clothing 67.8
Font 64.3
Stock photography 62.9
History 62.7
T-shirt 60.8
Photo caption 60.7
Team 59.9
Sitting 58.1
Tableware 56

Microsoft
created on 2022-01-23

text 99.7
person 97.3
table 95.4
bottle 75.3
clothing 72.2
man 71.3
furniture 56.3

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 64.2%
Calm 92.5%
Angry 1.9%
Happy 1.4%
Fear 1.2%
Sad 1.1%
Surprised 1%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 25-35
Gender Female, 85.9%
Happy 99.8%
Calm 0.1%
Sad 0%
Surprised 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 58.1%
Calm 85.8%
Happy 7.7%
Sad 3.6%
Confused 1%
Angry 0.8%
Disgusted 0.6%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Female, 81.2%
Calm 91.9%
Happy 6.8%
Sad 0.6%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 22-30
Gender Male, 96.4%
Sad 64.2%
Calm 28.2%
Confused 2.3%
Angry 1.8%
Disgusted 1.4%
Surprised 0.8%
Happy 0.7%
Fear 0.6%
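
The per-face age range, gender, and emotion scores above match the fields returned by AWS Rekognition's DetectFaces API. As a hedged sketch only (the file name is a placeholder), comparable output could be produced like this:

    # Sketch: per-face attributes from AWS Rekognition DetectFaces (boto3).
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 21, "High": 29}
        gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 64.2}
        emotions = sorted(face["Emotions"],  # e.g. [{"Type": "CALM", "Confidence": 92.5}, ...]
                          key=lambda e: e["Confidence"], reverse=True)
        print(f'Age {age["Low"]}-{age["High"]}, '
              f'Gender {gender["Value"]} {gender["Confidence"]:.1f}%, '
              f'{emotions[0]["Type"].title()} {emotions[0]["Confidence"]:.1f}%')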

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
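
The per-face likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) correspond to the fields of Google Cloud Vision's face detection response. A minimal sketch, assuming the google-cloud-vision Python client and a placeholder file name; the API reports enum names such as VERY_UNLIKELY, which the record displays as "Very unlikely".

    # Sketch: face likelihoods from Google Cloud Vision (google-cloud-vision).
    # Assumes Google Cloud credentials are configured; "photo.jpg" is a placeholder path.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)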

Feature analysis

Amazon

Person 99.3%
Chair 96.1%

Text analysis

Amazon

119321.
119321
KAOOX
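
The detected strings above ("119321.", "119321", "KAOOX") are the kind of result returned by an OCR pass over the photograph. A minimal sketch, assuming AWS Rekognition's DetectText API and a placeholder file name:

    # Sketch: text detection with AWS Rekognition DetectText (boto3).
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        # Type is "LINE" or "WORD"; DetectedText holds the recognized string.
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))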

Google

93 21. 會曾 日B 19 32 1.
93
21.
B
19
32
1.