Human Generated Data

Title

Untitled (men playing cards at table on circus train)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7107

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clinic 99.2
Human 98.9
Person 98.9
Person 98.3
Person 98.1
Person 97.4
Hospital 96.3
Operating Theatre 95.9
Person 95.5
Hat 77
Clothing 77
Apparel 77
Person 68.5
Room 67.8
Indoors 67.8
Building 56
Surgery 55.5
Doctor 55.5
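The Amazon label list above follows the shape of an AWS Rekognition DetectLabels response (a list of `{"Name": ..., "Confidence": ...}` entries). A minimal sketch of flattening such a response into the "Name score" lines shown here; `format_labels` is a hypothetical helper, and the sample response is illustrative, not the actual API output for this photograph:

```python
def format_labels(response, min_confidence=55.0):
    """Flatten a Rekognition DetectLabels-style response into
    'Name Confidence' lines, dropping low-confidence labels."""
    lines = []
    for label in response.get("Labels", []):
        conf = label["Confidence"]
        if conf >= min_confidence:
            # Scores are reported to one decimal place, as above.
            lines.append(f"{label['Name']} {round(conf, 1)}")
    return lines

# Illustrative sample response (not the real one for this image):
sample = {"Labels": [{"Name": "Clinic", "Confidence": 99.23},
                     {"Name": "Hat", "Confidence": 42.0}]}
```

With the default 55.0 cutoff, `format_labels(sample)` keeps only the "Clinic" entry; the real record above evidently used a lower threshold, since labels in the 55-56 range survive.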

Imagga
created on 2021-12-15

negative 36.1
film 33.3
photographic paper 23.5
bride 20.5
wedding 20.2
glass 18.4
celebration 18.3
dress 17.2
photographic equipment 15.7
people 15.6
adult 14.9
bouquet 14.2
person 13.8
chandelier 12.8
groom 12.2
decoration 12.1
clothing 12
love 11.8
day 11.8
fashion 11.3
party 11.2
veil 10.8
medical 10.6
human 10.5
flowers 10.4
portrait 10.4
hair 10.3
professional 10.2
romantic 9.8
bridal 9.7
table 9.7
ceremony 9.7
formal 9.5
lighting fixture 9.5
elegant 9.4
elegance 9.2
romance 8.9
lady 8.9
science 8.9
sexy 8.8
man 8.7
holiday 8.6
life 8.6
luxury 8.6
marriage 8.5
biology 8.5
flower 8.5
black 8.4
pretty 8.4
fixture 8.1
work 7.9
medicine 7.9
couple 7.8
reception 7.8
happiness 7.8
lab 7.8
chemistry 7.7
laboratory 7.7
research 7.6
development 7.6
hand 7.6
instrument 7.6
holding 7.4
business 7.3
lifestyle 7.2
lovely 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.3
clothing 90.7
person 89.1
window 83.8
black and white 83.3
man 68.5
drawing 50.7

Face analysis

AWS Rekognition

Age 31-47
Gender Female, 87.1%
Calm 65.4%
Sad 30.8%
Angry 3.1%
Confused 0.4%
Surprised 0.2%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-42
Gender Male, 87.1%
Calm 97.5%
Sad 1.9%
Happy 0.3%
Confused 0.1%
Surprised 0.1%
Angry 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 37-55
Gender Male, 57%
Calm 64.7%
Sad 32.5%
Confused 1.1%
Surprised 1.1%
Angry 0.3%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 28-44
Gender Female, 54.7%
Calm 93%
Sad 2.4%
Disgusted 1.4%
Surprised 0.9%
Happy 0.8%
Angry 0.6%
Confused 0.6%
Fear 0.2%

AWS Rekognition

Age 25-39
Gender Male, 82.2%
Sad 98.2%
Calm 1.5%
Fear 0.2%
Confused 0.1%
Surprised 0%
Angry 0%
Happy 0%
Disgusted 0%
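Each per-face block above mirrors one `FaceDetail` entry in an AWS Rekognition DetectFaces response, which reports an age range, a gender guess with confidence, and per-emotion confidences. A minimal sketch of rendering one such entry into the lines used here; `summarize_face` is a hypothetical helper and the sample data is illustrative:

```python
def summarize_face(face_detail):
    """Render one Rekognition DetectFaces-style FaceDetail as
    age-range, gender, and emotion lines, emotions sorted by confidence."""
    age = face_detail["AgeRange"]
    gender = face_detail["Gender"]
    lines = [f"Age {age['Low']}-{age['High']}",
             f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%"]
    # Rekognition reports emotion types in upper case (e.g. "CALM").
    for e in sorted(face_detail["Emotions"],
                    key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%")
    return lines

# Illustrative FaceDetail (values echo the first face block above):
face = {"AgeRange": {"Low": 31, "High": 47},
        "Gender": {"Value": "Female", "Confidence": 87.1},
        "Emotions": [{"Type": "SAD", "Confidence": 30.8},
                     {"Type": "CALM", "Confidence": 65.4}]}
```

Note that the emotion scores are classifier confidences over a fixed label set, not ground truth; the dominant "Sad" reading on the last face, for instance, may simply reflect a neutral card-playing expression.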

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%
Hat 77%

Captions

Microsoft

a group of people standing in front of a window 55.7%
a group of people standing next to a window 55.6%
a group of people sitting and standing in front of a window 50.1%

Text analysis

Amazon

12

Google

12
12