Human Generated Data

Title

Untitled (alumni seated around a table under a tent, Princeton University reunion, Princeton, NJ)

Date

c. 1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8173

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Person 99.1
Person 97.5
Person 97.5
Person 97.2
Person 96.1
Person 93
Person 91.7
Person 89.3
Person 88.9
Clinic 88.9
Person 82.4
Person 81.3
Building 73.9
Sunglasses 71.9
Accessories 71.9
Accessory 71.9
Crowd 71.2
People 68.8
Room 68.2
Indoors 68.2
Apparel 64
Clothing 64
Doctor 56.8
Person 54.8
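
The list above is raw object/scene tagging output (label name plus confidence score) from Amazon Rekognition. Below is a minimal sketch, not part of the record, of how such tags could be reproduced with the Rekognition DetectLabels API, assuming boto3, valid AWS credentials, and a hypothetical local copy of the photograph:

    # Sketch only: reproduces "label confidence" pairs like the Amazon tag list above.
    # The image file name is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_reunion.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=30,        # cap on the number of returned labels
        MinConfidence=50.0,  # drop low-confidence tags
    )

    # Print label name and confidence, mirroring the tag list in this record
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")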

Clarifai
created on 2023-10-25

people 99.9
many 99
adult 98.4
group 98.3
man 97.2
woman 96
group together 93.1
education 89.9
wear 86.8
child 85.7
crowd 81
uniform 78.7
medical practitioner 74.6
leader 74.2
monochrome 72.7
boxer 71.6
gown (clothing) 71
military 67.8
scientist 67.2
war 66.1

Imagga
created on 2022-01-08

people 21.2
men 17.2
man 16.8
medical 15.9
person 15.5
brass 14.9
musical instrument 14.5
worker 14.3
male 14.2
wind instrument 13.1
work 12.5
laboratory 12.5
glass 12.4
biology 12.3
business 12.1
lab 11.6
chemistry 11.6
room 11.5
research 11.4
wedding 11
equipment 10.8
chemical 10.8
scientist 10.8
scientific 10.6
world 10.5
instrument 10.5
human 10.5
technology 10.4
table 10.4
case 10.3
hospital 9.9
groom 9.9
professional 9.7
medicine 9.7
education 9.5
information 8.8
working 8.8
adult 8.6
bride 8.6
party 8.6
development 8.5
nurse 8.4
surgeon 8.3
city 8.3
team 8.1
science 8
art 7.9
coat 7.9
old 7.7
life 7.5
study 7.5
occupation 7.3
group 7.2
student 7.2
women 7.1
businessman 7.1
stage 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 93.6
outdoor 93.5
text 91.2
drawing 67.2
clothing 56.9
old 45.8

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 76.5%
Happy 18.3%
Sad 2.3%
Confused 1%
Surprised 0.8%
Angry 0.8%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Male, 99.5%
Calm 45%
Sad 41.9%
Angry 5.7%
Confused 2.9%
Disgusted 1.5%
Happy 1.5%
Surprised 0.8%
Fear 0.6%

AWS Rekognition

Age 31-41
Gender Male, 99%
Sad 95.4%
Confused 1.5%
Calm 1.2%
Happy 0.7%
Angry 0.5%
Disgusted 0.4%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 45-51
Gender Male, 98.4%
Happy 56.3%
Sad 17.1%
Surprised 13.6%
Confused 8.5%
Calm 2%
Angry 1.7%
Disgusted 0.6%
Fear 0.2%

AWS Rekognition

Age 54-64
Gender Female, 93.4%
Calm 56.3%
Surprised 41.9%
Angry 0.6%
Confused 0.5%
Sad 0.3%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 97.1%
Sad 59.3%
Calm 36.9%
Confused 1.3%
Disgusted 0.6%
Fear 0.6%
Angry 0.6%
Surprised 0.4%
Happy 0.4%

AWS Rekognition

Age 38-46
Gender Male, 88.9%
Calm 97.4%
Confused 1.2%
Sad 0.8%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Sad 81.4%
Confused 4.7%
Fear 3.3%
Happy 3%
Calm 2.6%
Disgusted 2.2%
Angry 1.5%
Surprised 1.2%

AWS Rekognition

Age 48-56
Gender Female, 93%
Calm 48.7%
Sad 17.6%
Happy 9.5%
Surprised 7.1%
Fear 6%
Confused 6%
Disgusted 3%
Angry 2.1%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 73.7%
Sad 13.8%
Surprised 3.4%
Happy 2.6%
Disgusted 2.5%
Confused 2.1%
Angry 1.3%
Fear 0.7%

AWS Rekognition

Age 33-41
Gender Female, 95.8%
Calm 85.9%
Disgusted 3.4%
Happy 2.6%
Angry 2.5%
Surprised 2.1%
Confused 1.7%
Sad 0.9%
Fear 0.8%
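
Each AWS Rekognition entry above reports, for one detected face, an estimated age range, a gender guess with confidence, and per-emotion confidence scores. A minimal sketch of how this output could be reproduced with the Rekognition DetectFaces API, assuming boto3, valid AWS credentials, and a hypothetical local copy of the photograph:

    # Sketch only: prints one block per detected face, in the same shape as the
    # AWS Rekognition face-analysis entries above. The image file name is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_reunion.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back with per-emotion confidences; sort descending for display
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
        print()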

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Sunglasses 71.9%

Text analysis

Amazon

32A8
32A8 YE3
YE3

Google

32A8 YT37A2 A3OM3730
32A8
YT37A2
A3OM3730