Human Generated Data

Title

Untitled (alumni talking and drinking in a corner, Princeton University reunion, Princeton, NJ)

Date

c. 1937

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8181

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.3
Person 99.3
Person 99.1
Person 98.9
Furniture 98.8
Person 98.7
Person 97.8
Person 97.3
Person 95.2
Restaurant 94.2
Chair 93.4
Person 88.7
Food 85.1
Meal 85.1
Table 84.9
Apparel 83.4
Clothing 83.4
Dish 72.6
Cafeteria 69.3
Face 67.4
People 66.9
Dining Table 64.4
Indoors 64.2
Portrait 63.4
Photography 63.4
Photo 63.4
Room 61.4
Food Court 59.2
Clinic 58.5
Cafe 58
Couch 57.6
Sitting 56.5
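The Amazon tag list above pairs each label with a confidence score, matching the shape of an AWS Rekognition `DetectLabels` response (`{"Labels": [{"Name": ..., "Confidence": ...}]}`). A minimal sketch of how such a response might be filtered to the high-confidence labels, using a handful of the values above as sample data (the 90.0 threshold is an arbitrary choice for illustration):

```python
# Sample data copied from the Amazon tag list above, arranged in the
# shape of a real AWS Rekognition DetectLabels response.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Furniture", "Confidence": 98.8},
        {"Name": "Restaurant", "Confidence": 94.2},
        {"Name": "Chair", "Confidence": 93.4},
        {"Name": "Food", "Confidence": 85.1},
        {"Name": "Cafeteria", "Confidence": 69.3},
        {"Name": "Sitting", "Confidence": 56.5},
    ]
}

def confident_labels(resp, threshold=90.0):
    """Keep only labels at or above the confidence threshold,
    sorted from most to least confident."""
    labels = [l for l in resp["Labels"] if l["Confidence"] >= threshold]
    return sorted(labels, key=lambda l: l["Confidence"], reverse=True)

for label in confident_labels(response):
    print(f'{label["Name"]}: {label["Confidence"]:.1f}')
```

With the sample above, only Person, Furniture, Restaurant, and Chair clear the 90.0 cutoff.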

Imagga
created on 2022-01-08

hospital 42.1
barbershop 31.9
people 29
shop 29
man 27.5
person 25.4
couple 22.6
mercantile establishment 21.6
adult 20.3
male 19.1
family 18.7
patient 17
happiness 15.7
room 15.6
place of business 14.4
love 14.2
indoors 14.1
smiling 13.7
men 13.7
groom 13.4
two 12.7
old 12.5
child 12.5
bride 12.5
life 12.3
sitting 12
home 12
happy 11.9
dress 11.7
husband 11.4
nurse 11.1
women 11.1
wedding 11
color 10.6
senior 10.3
mother 10.2
human 9.7
medical 9.7
portrait 9.7
together 9.6
wife 9.5
bouquet 9.4
doctor 9.4
case 9.3
chair 9
new 8.9
work 8.9
interior 8.8
kin 8.8
lifestyle 8.7
scene 8.7
teacher 8.6
married 8.6
togetherness 8.5
sick person 8.4
cheerful 8.1
romantic 8
to 8
business 7.9
surgery 7.8
health 7.6
loving 7.6
fashion 7.5
enjoyment 7.5
clothes 7.5
vintage 7.4
office 7.4
care 7.4
lady 7.3
professional 7.2
clothing 7.2
establishment 7.2
face 7.1
counter 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 98.6
window 91.8
clothing 73.1
people 70.6
group 65.9
preparing 64.9
cooking 27.4
meal 26.1

Face analysis

AWS Rekognition

Age 41-49
Gender Female, 97.9%
Calm 99.3%
Happy 0.6%
Confused 0%
Sad 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 90.7%
Calm 99.8%
Surprised 0.1%
Sad 0%
Happy 0%
Disgusted 0%
Fear 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 24-34
Gender Female, 91.5%
Calm 93.4%
Surprised 3.5%
Sad 1.5%
Fear 0.5%
Confused 0.4%
Angry 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Calm 58%
Sad 15.7%
Happy 8.2%
Angry 6.2%
Confused 4.5%
Disgusted 4.2%
Surprised 2.2%
Fear 0.9%

AWS Rekognition

Age 23-31
Gender Male, 99.8%
Sad 71.3%
Confused 8%
Calm 5.3%
Disgusted 4.5%
Angry 4.1%
Happy 3.3%
Surprised 2.2%
Fear 1.3%
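Each AWS Rekognition face block above reports an age range, a gender estimate, and a full emotion distribution, mirroring the `Emotions` list inside a `DetectFaces` `FaceDetail`. A minimal sketch of picking the dominant emotion from such a record, using the scores from the last face above as sample data:

```python
# Sample data taken from the last AWS Rekognition face block above,
# arranged in the shape of a Rekognition DetectFaces FaceDetail.
face = {
    "AgeRange": {"Low": 23, "High": 31},
    "Gender": {"Value": "Male", "Confidence": 99.8},
    "Emotions": [
        {"Type": "SAD", "Confidence": 71.3},
        {"Type": "CONFUSED", "Confidence": 8.0},
        {"Type": "CALM", "Confidence": 5.3},
        {"Type": "DISGUSTED", "Confidence": 4.5},
        {"Type": "ANGRY", "Confidence": 4.1},
        {"Type": "HAPPY", "Confidence": 3.3},
        {"Type": "SURPRISED", "Confidence": 2.2},
        {"Type": "FEAR", "Confidence": 1.3},
    ],
}

def dominant_emotion(face_detail):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # → ('SAD', 71.3)
```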

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Chair 93.4%

Captions

Microsoft

a group of people sitting and standing in front of a window 94.7%
a group of people standing next to a window 94.6%
a group of people standing in front of a window 94.5%

Text analysis

Amazon

YT33AS
3248 YT33AS 830N3730
3248
830N3730

Google

32A8 YT3BA2 830M3a3g
YT3BA2
32A8
830M3a3g