Human Generated Data

Title

Untitled (men and women seated around dining room table)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8382

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.5
Person 99.5
Person 99.4
Person 99.1
Person 99
Person 98.9
Person 98.9
Person 98.3
Person 97.8
Room 92.8
Indoors 92.8
Clinic 90.1
Chair 89.2
Furniture 89.2
School 86.4
Classroom 86.4
Person 83.1
Table 68.7
Crowd 65.6
People 63.1
Photography 60.9
Face 60.9
Photo 60.9
Portrait 60.9
Chair 58.5
Food 56.6
Meal 56.6
Hospital 56.4
Workshop 56
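
Each provider section in this record pairs a label with a confidence score. As a purely illustrative sketch (not the museum's actual pipeline), one might filter such a list to keep only high-confidence tags; the function name and threshold below are assumptions for the example:

```python
# Hypothetical post-processing of "label confidence" lines like the
# Amazon tag list above; parses the trailing number as the score.

def filter_labels(lines, min_confidence=90.0):
    """Keep (label, score) pairs at or above min_confidence."""
    kept = []
    for line in lines:
        label, _, score = line.rpartition(" ")
        try:
            value = float(score)
        except ValueError:
            continue  # skip malformed lines
        if label and value >= min_confidence:
            kept.append((label, value))
    return kept

# Sample values transcribed from the Amazon list above.
amazon_tags = ["Human 99.5", "Person 99.5", "Room 92.8", "Table 68.7"]
print(filter_labels(amazon_tags))
# [('Human', 99.5), ('Person', 99.5), ('Room', 92.8)]
```
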

Imagga
created on 2022-01-09

barbershop 54.6
shop 47.5
man 42.3
room 39.9
male 37.7
classroom 35.6
mercantile establishment 34.8
people 32.9
person 31.6
adult 23.9
businessman 23.8
professional 23.5
place of business 23.2
business 23.1
patient 22
indoors 21.1
office 20.7
smiling 19.5
clinic 19.5
men 18.9
nurse 18.8
happy 18.8
teacher 18
meeting 17.9
working 17.7
medical 17.6
colleagues 17.5
education 17.3
hospital 17.1
mature 15.8
work 15.7
sitting 15.5
businesspeople 15.2
desk 15.1
table 14.7
indoor 14.6
businesswoman 14.5
team 14.3
portrait 14.2
job 14.1
doctor 14.1
group 13.7
40s 13.6
interior 13.3
teamwork 13
corporate 12.9
20s 12.8
women 12.6
computer 12
home 12
color 11.7
class 11.6
establishment 11.5
talking 11.4
senior 11.2
worker 11
casual 11
day 11
board 10.8
lifestyle 10.8
discussion 10.7
two people 10.7
smile 10.7
together 10.5
standing 10.4
looking 10.4
occupation 10.1
communication 10.1
care 9.9
center 9.8
cheerful 9.8
health 9.7
couple 9.6
specialist 9.4
chair 9.4
executive 9.3
school 9.2
horizontal 9.2
educator 8.9
to 8.8
associates 8.8
30s 8.7
modern 8.4
case 8.3
camera 8.3
laptop 8.2
suit 8.1
building 8.1
restaurant 8
coworkers 7.9
conference 7.8
student 7.8
listening 7.7
illness 7.6
two 7.6
adults 7.6
learning 7.5
technology 7.4
family 7.1
medicine 7

Google
created on 2022-01-09

Shirt 93.9
Black 89.6
Table 86.7
Black-and-white 84.9
Window 83.9
Style 83.8
Chair 81.4
Monochrome 77.1
Monochrome photography 76.8
Snapshot 74.3
Event 72.9
T-shirt 72.4
Suit 71.7
Food 69.2
Cooking 69
Vintage clothing 68.9
Room 68.9
History 65.9
Tableware 64.9
Team 64.6

Microsoft
created on 2022-01-09

text 99.5
person 99.5
clothing 98.1
man 96.6
group 90.7
people 80.5
table 78.2
food 53.8
crowd 0.9

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.7%
Calm 42.9%
Happy 33.9%
Surprised 18.6%
Confused 1.5%
Sad 1.3%
Disgusted 1.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Female, 76.8%
Calm 98.3%
Happy 1.1%
Surprised 0.4%
Sad 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Happy 52.2%
Calm 26.3%
Sad 16.2%
Confused 2.5%
Disgusted 1.2%
Angry 0.9%
Surprised 0.4%
Fear 0.2%

AWS Rekognition

Age 51-59
Gender Male, 100%
Calm 57.5%
Happy 22%
Sad 10.7%
Confused 3.7%
Surprised 2.4%
Disgusted 2.3%
Fear 0.7%
Angry 0.7%

AWS Rekognition

Age 47-53
Gender Male, 96.4%
Surprised 45.4%
Calm 36.7%
Confused 6.3%
Happy 5.4%
Disgusted 3.1%
Sad 1.3%
Fear 0.9%
Angry 0.9%

AWS Rekognition

Age 47-53
Gender Male, 95.2%
Calm 68.7%
Sad 10.1%
Happy 9.6%
Surprised 7.1%
Confused 2.3%
Fear 1%
Disgusted 0.9%
Angry 0.5%

AWS Rekognition

Age 40-48
Gender Female, 99.9%
Calm 89.3%
Happy 7%
Surprised 1.8%
Sad 1.4%
Disgusted 0.2%
Fear 0.1%
Confused 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
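
Each AWS Rekognition face block above reports a percentage per emotion. A minimal sketch, assuming one wanted to summarize a face by its dominant emotion (the dict values are transcribed from the first face entry; the function is illustrative, not part of the record):

```python
# Hypothetical summarization of a Rekognition-style emotion breakdown.

def dominant_emotion(emotions):
    """Return the (name, percent) pair with the highest percentage."""
    return max(emotions.items(), key=lambda kv: kv[1])

# Percentages from the first AWS Rekognition face above.
face_1 = {"Calm": 42.9, "Happy": 33.9, "Surprised": 18.6, "Confused": 1.5,
          "Sad": 1.3, "Disgusted": 1.3, "Angry": 0.2, "Fear": 0.2}
print(dominant_emotion(face_1))
# ('Calm', 42.9)
```
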

Feature analysis

Amazon

Person 99.5%
Chair 89.2%

Captions

Microsoft

a group of people sitting at a table 93.6%
a group of people sitting around a table 93.5%
a group of people standing around a table 93.4%

Text analysis

Amazon

12233
H

Google

12233 • 12 233·
12
233·
12233