Human Generated Data

Title

Untitled (circus performers exiting train)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8497

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.3
Human 99.3
Person 99.1
Clothing 98.2
Apparel 98.2
Person 96.6
Coat 94.1
Face 90.6
Outdoors 86.5
Nature 86.5
Grass 86.2
Plant 86.2
People 80.9
Shelter 75.4
Building 75.4
Rural 75.4
Countryside 75.4
Meal 75.4
Food 75.4
Yard 74.6
Military Uniform 73.4
Military 73.4
Female 73.3
Person 72.1
Soldier 66.3
Portrait 64.8
Photography 64.8
Photo 64.8
Suit 60.2
Overcoat 60.2
Girl 59.2
Shorts 58.9
Collage 58
Advertisement 58
Poster 58
Woman 55.3
Hat 55.2
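
These label/confidence pairs have the shape of output from Amazon Rekognition's DetectLabels operation. As a minimal sketch of how such a list could be generated with boto3 (the file name and thresholds below are illustrative assumptions, not part of this record):

import boto3

# Assumed local copy of the photograph; not part of the original record.
rekognition = boto3.client("rekognition")
with open("steinmetz_8497.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # enough to cover the list above
        MinConfidence=55.0,  # the lowest score shown above is 55.2
    )

# Rekognition reports one entry per concept; the repeated "Person" rows
# above correspond to per-instance detections in label["Instances"].
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")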

Imagga
created on 2022-01-15

statue 23.9
person 18.3
sculpture 17.9
man 17.5
old 14.6
clothing 14.5
electric chair 14.1
art 13.5
mask 13.2
military 12.5
people 12.3
instrument of execution 11.5
instrument 11
protection 10.9
outdoor 10.7
face 10.6
male 10.6
adult 10.4
weapon 10.4
monument 10.3
device 10.1
dirty 9.9
dress 9.9
soldier 9.8
dark 9.2
traditional 9.1
danger 9.1
portrait 9.1
religion 9
television 8.6
stone 8.6
culture 8.5
costume 8.4
historic 8.2
style 8.2
lady 8.1
toxic 7.8
horror 7.8
city 7.5
outdoors 7.5
memorial 7.3
antique 7.1
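
Imagga exposes its tagger as a REST endpoint rather than an SDK. A hedged sketch of how a tag list like the one above could be requested; the credentials and image URL are placeholders:

import requests

# Placeholder credentials; Imagga issues an API key/secret pair per account.
IMAGGA_AUTH = ("<api_key>", "<api_secret>")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/steinmetz_8497.jpg"},
    auth=IMAGGA_AUTH,
)
resp.raise_for_status()

# Each entry carries a confidence score and a localized tag name.
for entry in resp.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))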

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.1
person 95.7
black and white 93
clothing 86.3
outdoor 85.6
statue 71
monochrome 51.4
old 48.3

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 84.3%
Calm 99%
Happy 0.4%
Surprised 0.4%
Sad 0.1%
Disgusted 0.1%
Fear 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 34-42
Gender Male, 70.9%
Happy 94.1%
Surprised 2.8%
Calm 1.3%
Fear 0.7%
Sad 0.7%
Angry 0.2%
Confused 0.2%
Disgusted 0.1%
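
Both blocks above follow the structure of Amazon Rekognition DetectFaces output: an estimated age range, a gender call with confidence, and a ranked emotion distribution. A minimal boto3 sketch, with the file name assumed:

import boto3

rekognition = boto3.client("rekognition")
with open("steinmetz_8497.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; rank them as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")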

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
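
Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the Rekognition ones. A sketch using the official Python client, with the file name assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz_8497.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)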

Feature analysis

Amazon

Person 99.3%
Coat 94.1%

Captions

Microsoft

a vintage photo of a man 84%
a vintage photo of a man standing in front of a window 69.4%
a vintage photo of a man standing next to a window 68%
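
Ranked captions of this form, each with a confidence score, are what Azure's Computer Vision "describe image" operation returns (the same service behind the Microsoft tag list above). A sketch with the Python SDK, assuming placeholder endpoint, key, and image URL:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and subscription key.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

analysis = client.describe_image(
    "https://example.com/steinmetz_8497.jpg",
    max_candidates=3,  # the record above shows three candidate captions
)
for caption in analysis.captions:
    # Confidence is reported on a 0-1 scale; the listing shows percentages.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")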

Text analysis

Amazon

16269
16269.

Google

16269. 16269.
16269.
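
The OCR readings above are typical of Amazon Rekognition's DetectText and Google Vision's text detection, each of which can report the same string more than once (as separate LINE and WORD detections, or separate regions). A boto3 sketch for the Rekognition side, with the file name assumed:

import boto3

rekognition = boto3.client("rekognition")
with open("steinmetz_8497.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections, which is why
# near-duplicate readings like "16269" and "16269." appear above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")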