Human Generated Data

Title

Untitled (men and women eating at dinner table)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8267

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Chair 99.9
Furniture 99.9
Human 99.5
Person 99.5
Person 99.4
Restaurant 99.3
Person 99.2
Person 99.2
Person 99.2
Person 97.8
Person 96
Chair 94.2
Cafeteria 89.8
Sitting 89.7
Food Court 80.6
Food 80.6
Cafe 80.4
Dining Table 78.8
Table 78.8
Wood 67.7
Person 66.6
Flooring 60.3
Meal 60
Couch 59.6
Indoors 58.9
Chair 55.6
Person 43

Clarifai
created on 2023-10-25

people 99.8
group 98.6
furniture 98.1
adult 97.8
room 97.5
group together 97.4
woman 97.1
man 96.3
dining room 95
indoors 94.7
chair 93.7
table 92.9
many 92.2
education 91.7
child 91.2
sit 90
monochrome 89.1
meeting 88.2
leader 87.5
dining 86.3

Imagga
created on 2022-01-08

classroom 79.7
room 73.9
table 36.3
restaurant 35.2
people 30.1
man 28.9
office 26.6
chair 26.5
male 26.2
business 23.7
sitting 23.2
person 23.1
meeting 22.6
interior 22.1
group 21
indoors 20.2
building 19.5
businessman 19.4
modern 18.9
men 18.9
smiling 18.8
communication 18.5
work 18
team 17.9
cafeteria 17.8
happy 17.5
lifestyle 17.3
education 17.3
talking 17.1
teamwork 16.7
women 16.6
student 16.2
waiter 15.6
corporate 15.5
teacher 15.4
desk 15.1
hall 15.1
together 14.9
adult 14.3
school 13.7
businesswoman 13.6
smile 13.5
structure 13.4
cheerful 13
suit 12.6
learning 12.2
worker 12.2
colleagues 11.7
professional 11.6
couple 11.3
executive 11.3
design 11.3
presentation 11.2
wine 11.1
glass 10.9
life 10.9
house 10.9
conference 10.7
employee 10.6
class 10.6
working 10.6
dining 10.5
computer 10.4
businesspeople 10.4
seat 10.4
home 10.4
portrait 10.4
manager 10.2
happiness 10.2
laptop 10.2
furniture 9.9
percussion instrument 9.9
marimba 9.8
teaching 9.7
success 9.7
drinking 9.6
dining-room attendant 9.4
coffee 9.3
eating 9.3
food 9.2
city 9.1
indoor 9.1
holding 9.1
board 9
musical instrument 9
job 8.8
students 8.8
discussion 8.8
urban 8.7
day 8.6
architecture 8.6
empty 8.6
contemporary 8.5
study 8.4
meal 8.4
eat 8.4
mature 8.4
floor 8.4
drink 8.4
color 8.3
children 8.2
kid 8
dinner 7.9
love 7.9
diverse 7.8
boy 7.8
center 7.8
diversity 7.7
studying 7.7
hotel 7.6
friends 7.5
enjoyment 7.5
place 7.4
technology 7.4
service 7.4
blackboard 7.4
inside 7.4
girls 7.3
new 7.3
lunch 7.2

Google
created on 2022-01-08

Furniture 94.9
Photograph 94.2
Table 92.7
Chair 90
Black-and-white 85.1
Style 83.9
Suit 79.9
Adaptation 79.3
Monochrome photography 74.9
Snapshot 74.3
Monochrome 74.1
Event 72.1
Font 71.6
Room 69.5
Plant 69.1
Stock photography 67.3
Sitting 67.1
Coffee table 64.9
History 64.5
Art 64.2

Microsoft
created on 2022-01-08

text 97.9
table 97.8
person 97.3
furniture 96.6
black and white 93.8
clothing 90.5
chair 90.5
man 68.5
monochrome 60.1
desk 59.9
group 56.4
room 41.9

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 72.2%
Calm 96.2%
Sad 3.4%
Surprised 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Happy 0%

AWS Rekognition

Age 29-39
Gender Male, 77.7%
Calm 99.3%
Sad 0.4%
Confused 0.1%
Angry 0%
Surprised 0%
Disgusted 0%
Happy 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.9%
Person 99.5%

Text analysis

Amazon

MJI7
7791.
A7DA
MJI7 YESTAS A7DA
YESTAS

Google

7791
MJ17
2
A7JA
7791 MJ17 YT37A 2 A7JA
YT37A