Human Generated Data

Title

Untitled (men seated around a table, Cloister Inn Club Dinner, Haverford, PA)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8285

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.5
Person 99.5
Person 99.4
Person 98.8
Person 98.8
Clinic 98.7
Person 98.4
Person 97.6
Person 96.6
Person 94
Hospital 92.4
Operating Theatre 84.8
Indoors 78.3
Room 78.3
Person 60.8
Doctor 56

Imagga
created on 2022-01-08

person 40.4
man 32.2
people 31.8
barbershop 31.2
nurse 29.5
male 28.3
patient 27.7
home 26.3
adult 24.5
shop 24.3
room 23.7
men 21.5
teacher 20.4
happy 19.4
mercantile establishment 19.4
classroom 18.7
couple 18.3
professional 18.1
businessman 17.6
indoors 17.6
case 16.5
business 16.4
office 16.1
sick person 15.6
smiling 15.2
education 14.7
table 14.7
sitting 14.6
group 14.5
world 13.8
senior 13.1
place of business 12.9
smile 12.8
school 12.8
women 12.6
work 12.5
happiness 12.5
talking 12.3
student 12.3
together 12.3
casual 11.9
board 11.7
team 11.6
family 11.6
working 11.5
holding 10.7
teaching 10.7
worker 10.7
job 10.6
interior 10.6
class 10.6
cheerful 10.6
boy 10.4
portrait 10.3
child 10.3
indoor 10
modern 9.8
30s 9.6
black 9.6
executive 9.4
teamwork 9.3
communication 9.2
house 9.2
businesswoman 9.1
clothing 9.1
computer 8.8
casual clothing 8.8
desk 8.5
meeting 8.5
hospital 8.4
horizontal 8.4
blackboard 8.4
mother 8.2
handsome 8
kin 8
chair 8
kid 8
lifestyle 7.9
medical 7.9
students 7.8
colleagues 7.8
studying 7.7
hand 7.6
specialist 7.6
businesspeople 7.6
enjoying 7.6
human 7.5
inside 7.4
life 7.3
laptop 7.3
color 7.2
looking 7.2
day 7.1

Google
created on 2022-01-08

Black 89.6
Window 79.8
Font 74.7
Monochrome photography 68.9
Monochrome 68.5
Team 68.2
Art 68
Event 68
Room 65.4
Sitting 64.3
History 63.5
Photo caption 59.9
Collaboration 50.1

Microsoft
created on 2022-01-08

person 96.6
text 89.8
window 85.6
human face 70.8
man 69.2
clothing 64.3
table 53.9

Face analysis

Amazon

AWS Rekognition

Age 49-57
Gender Male, 93.5%
Sad 42.8%
Calm 20.4%
Happy 19.3%
Angry 5.1%
Confused 4%
Disgusted 3.9%
Surprised 2.5%
Fear 1.9%

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Sad 98.7%
Calm 0.5%
Confused 0.3%
Happy 0.2%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Surprised 0%

AWS Rekognition

Age 24-34
Gender Male, 98.2%
Calm 95.6%
Sad 3.8%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 97.8%
Happy 1%
Sad 0.5%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 95.3%
Calm 99.7%
Sad 0.1%
Confused 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 47-53
Gender Female, 78.2%
Sad 68.1%
Happy 8.5%
Calm 8.5%
Confused 8.3%
Surprised 2.5%
Disgusted 1.7%
Fear 1.5%
Angry 1%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a group of people sitting at a table in front of a window 86.2%
a group of people sitting in front of a window 83.4%
a group of people standing in front of a window 83.3%

Text analysis

Amazon

3/29/39
9530.
9530
-
AZOA

Google

3|29|39.
9530.
9530. 9530. 3|29|39.