Human Generated Data

Title

Untitled (members of class of 1917 socializing under tent, Princeton, NJ)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7479

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.1
Person 99.1
Person 98.9
Person 98.8
Hat 97.6
Apparel 97.6
Clothing 97.6
Person 96.6
Person 91.8
Hat 84
Accessories 81.6
Accessory 81.6
Sunglasses 81.6
Tie 77.3
Face 77.2
Finger 76.4
Crowd 74.4
People 73.1
Text 69.6
Person 64.6
Head 63.8
Overcoat 60.6
Coat 60.6
Suit 60.1
Musical Instrument 56.6
Musician 56.6
Sleeve 56.3
Person 49.7
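
The labels above are raw output from Amazon Rekognition's DetectLabels API, each name paired with a confidence score. A minimal sketch of how such tags can be reproduced, assuming boto3 with valid AWS credentials, an arbitrary region, and a hypothetical local filename for this photograph:

    import boto3

    # Rekognition client; the region is an assumption.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("steinmetz_7479.jpg", "rb") as f:
        image_bytes = f.read()

    # MinConfidence=45 keeps scores down to the "Person 49.7" entry above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=45,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")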

Imagga
created on 2022-01-09

man 48.4
surgeon 37.7
person 36.5
male 36.2
people 29.5
senior 23.4
patient 22.2
adult 21.9
portrait 19.4
men 18.9
professional 18.3
work 18
medical 17.7
worker 16.9
elderly 16.3
doctor 16
job 15.9
nurse 15.9
happy 15
hospital 14.8
occupation 14.7
working 14.1
old 13.9
equipment 12.8
home 12.8
business 12.7
retired 12.6
health 12.5
indoors 12.3
mask 11.9
surgery 11.7
room 11.5
hand 11.4
case 11.4
sitting 11.2
glasses 11.1
medicine 10.6
retirement 10.6
grandfather 10.5
illness 10.5
mature 10.2
smiling 10.1
face 9.9
clinic 9.9
handsome 9.8
businessman 9.7
looking 9.6
specialist 9.5
sick person 9.5
happiness 9.4
casual 9.3
care 9.1
human 9
pensioner 9
profession 8.6
industry 8.5
hat 8.5
teacher 8.3
office 8.2
team 8.1
to 8
operation 7.9
uniform 7.8
education 7.8
emergency 7.7
clothing 7.6
adults 7.6
camera 7.4
industrial 7.3
lifestyle 7.2
helmet 7
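
The Imagga tags follow the same name-and-score pattern but are served over a REST endpoint rather than an SDK. A minimal sketch against Imagga's v2 /tags endpoint, assuming placeholder API credentials and the same hypothetical local file:

    import requests

    # Placeholder credentials; Imagga uses HTTP Basic auth with key/secret.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"

    with open("steinmetz_7479.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Each tag carries an English name and a confidence score like those above.
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")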

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.9
text 98.5
clothing 91.9
man 85.8
hat 79.1
helmet 73.5
human face 71.1
black and white 68
fashion accessory 64.5

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 99.2%
Happy 63.9%
Confused 12.5%
Surprised 12.4%
Sad 4.3%
Disgusted 2.9%
Fear 2.6%
Angry 1%
Calm 0.5%

AWS Rekognition

Age 42-50
Gender Male, 87.6%
Calm 96.3%
Confused 1.2%
Happy 0.7%
Sad 0.6%
Surprised 0.5%
Disgusted 0.5%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 49-57
Gender Male, 74.4%
Surprised 65.4%
Angry 19.6%
Confused 5%
Happy 3.3%
Calm 2.3%
Sad 2%
Fear 1.2%
Disgusted 1.1%

AWS Rekognition

Age 48-54
Gender Male, 98.9%
Calm 79.4%
Happy 6.1%
Sad 4.3%
Surprised 3.4%
Confused 2.3%
Disgusted 2.1%
Fear 1.4%
Angry 1.1%

AWS Rekognition

Age 47-53
Gender Male, 98.8%
Calm 99.8%
Sad 0.1%
Happy 0%
Angry 0%
Surprised 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Female, 87.1%
Calm 50.4%
Fear 37.7%
Sad 4.3%
Happy 3%
Surprised 1.6%
Confused 1.2%
Disgusted 1.1%
Angry 0.6%
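
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender call with its confidence, and a distribution over eight fixed emotion categories. A minimal sketch of how these per-face records are obtained, assuming boto3 and the same hypothetical image file; Attributes=["ALL"] is required, since the default attribute subset omits age and emotions:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_7479.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotion types arrive uppercase (e.g. "HAPPY"); sort high to low
        # to match the ordering used in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")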

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
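
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) instead of numeric scores, one block per detected face. A minimal sketch, assuming the google-cloud-vision client library with application-default credentials and the same hypothetical file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_7479.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY),
    # matching the buckets printed in the blocks above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)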

Feature analysis

Amazon

Person 99.1%
Hat 97.6%
Sunglasses 81.6%
Tie 77.3%

Captions

Microsoft

a group of people standing around each other 83.2%
a group of people posing for a photo 83.1%
a group of people posing for the camera 83%
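
The three candidate captions come from the Describe operation of Microsoft's Computer Vision service, which returns ranked sentences with confidences on a 0-1 scale (rendered above as percentages). A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package and placeholder endpoint and key values:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for a Computer Vision resource.
    ENDPOINT = "https://your-resource.cognitiveservices.azure.com/"
    KEY = "your_key"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    with open("steinmetz_7479.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Confidences are 0-1; scale to match the percentages above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")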

Text analysis

Amazon

1917
32A8
850W
Y19342
8507
32A8 Y19342 83033300
83033300
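
The strings above are raw OCR hits from Amazon Rekognition's DetectText API, which reports both individual WORD detections and assembled LINE detections; that is consistent with "32A8" appearing alone and again inside the longer line. A minimal sketch, assuming boto3 and the same hypothetical image file:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("steinmetz_7479.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is typed WORD or LINE, with its own confidence.
    for detection in response["TextDetections"]:
        print(detection["DetectedText"], detection["Type"],
              f"{detection['Confidence']:.1f}")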

Google

1917
1917 Coca
Coca