Human Generated Data

Title

Untitled (men playing cards on train)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8189

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 99.2
Apparel 98.9
Hat 98.9
Clothing 98.9
Accessory 78.1
Accessories 78.1
Tie 78.1
Finger 71.3
Sitting 69.8
Shirt 58.9
Art 58.2
Sleeve 57.5
Tie 54.1

Imagga
created on 2022-01-08

man 51.1
office 37
male 36.2
person 35.4
people 32.9
work 29.8
businessman 27.4
computer 27
business 26.7
sitting 25.8
working 25.6
table 24.6
senior 24.4
adult 24.2
indoors 22.8
executive 22
professional 21.9
surgeon 21.9
home 21.5
desk 20.3
laptop 19
worker 18.9
smiling 18.1
corporate 18
looking 17.6
happy 17.6
job 16.8
meeting 15.1
portrait 14.9
engineer 14.6
businesswoman 14.5
elderly 14.4
to 14.2
education 13.9
suit 13.8
specialist 13.8
men 13.7
room 13.7
glasses 13
handsome 12.5
teacher 12.3
mature 12.1
successful 11.9
workplace 11.4
patient 11.4
businesspeople 11.4
hospital 11.1
inside 11
occupation 11
paper 11
medical 10.6
retirement 10.6
success 10.5
technology 10.4
keyboard 10.3
camera 10.2
two 10.2
lifestyle 10.1
one 9.7
retired 9.7
monitor 9.7
project 9.6
reading 9.5
sit 9.5
student 9.4
manager 9.3
face 9.2
confident 9.1
employee 9
team 9
couple 8.7
architect 8.7
career 8.5
doctor 8.5
showing 8.4
attractive 8.4
group 8.1
nurse 8
smile 7.8
happiness 7.8
employment 7.7
1 7.7
industry 7.7
old 7.7
health 7.6
serious 7.6
casual 7.6
hand 7.6
college 7.6
study 7.5
grandfather 7.3
women 7.1
medicine 7

Google
created on 2022-01-08

Shirt 94.5
Coat 92.9
Hat 91.6
Fedora 90.5
Sun hat 87.8
Black-and-white 82.2
Suit 76.7
Motor vehicle 75.7
Cap 74.5
Monochrome photography 73.7
Monochrome 73.2
Vintage clothing 70
Room 69.2
White-collar worker 67.6
Event 63.7
Sitting 63.4
Art 62.1
Font 59.1
History 58.2
Photographic paper 57.6

Microsoft
created on 2022-01-08

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 98.1%
Calm 83.6%
Sad 11.6%
Surprised 2.9%
Angry 1%
Confused 0.3%
Disgusted 0.3%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 47-53
Gender Male, 89.9%
Sad 84.4%
Calm 11.4%
Confused 2%
Angry 0.8%
Happy 0.4%
Disgusted 0.4%
Surprised 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Hat 98.9%
Tie 78.1%

Text analysis

Amazon

5129
YT37A2
12A6 YT37A2
12A6

Google

5129
5129