Human Generated Data

Title

Untitled (woman walking up circus train stairs)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7104

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 100
Apparel 100
Person 97.7
Human 97.7
Dress 89.7
Train 88.5
Transportation 88.5
Vehicle 88.5
Hat 79
Sun Hat 76.3
Shorts 75.4
Female 74.3
Overcoat 67.8
Coat 67.8
Shoe 63.6
Footwear 63.6
Woman 60
Suit 57.3
Portrait 56.2
Photography 56.2
Face 56.2
Photo 56.2
Cowboy Hat 55.6

Clarifai
created on 2023-10-15

people 99.7
woman 98.5
one 98
monochrome 97.2
luggage 97.1
adult 97
transportation system 96.5
train 96
vehicle 95.9
railway 94.7
wear 94.6
indoors 94.2
airport 93.6
locomotive 92.1
man 91.3
retro 89.4
street 88.1
sit 86.6
nostalgia 86.4
two 85.7

Imagga
created on 2021-12-15

musical instrument 28.2
accordion 25.5
chair 24.5
interior 23.9
keyboard instrument 22.9
room 20.5
home 19.1
indoors 18.4
wind instrument 17.7
house 15.9
seat 15.5
furniture 14.7
adult 14.5
device 14.4
people 12.8
sitting 12
inside 12
electric chair 11.8
fashion 11.3
person 11.1
smiling 10.8
light 10.7
cheerful 10.6
attractive 10.5
happy 10
office 9.7
instrument 9.6
apartment 9.6
window 9.6
instrument of execution 9.5
luxury 9.4
male 9.2
holding 9.1
architecture 9
women 8.7
happiness 8.6
portrait 8.4
elegance 8.4
pretty 8.4
dress 8.1
lady 8.1
man 8.1
throne 8
building 7.9
smile 7.8
travel 7.7
modern 7.7
living 7.6
armchair 7.4
indoor 7.3
decoration 7.3
business 7.3
box 7.3
lifestyle 7.2
antique 7.1
family 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 95.8
black and white 92.6
person 87.8
outdoor 85.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 64.7%
Fear 33.8%
Confused 22.4%
Surprised 14.6%
Sad 10.3%
Calm 7.1%
Happy 6.2%
Disgusted 3.3%
Angry 2.3%

Feature analysis

Amazon

Person 97.7%
Train 88.5%

Categories

Text analysis

Amazon

16167..
16167.
L9191
KAOOK

Google

16167. •L9191 T6167..
16167.
•L9191
T6167..