Human Generated Data

Title

Untitled (man and woman on front porch steps)

Date

1949

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10591

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.8
Person 99.8
Person 99.7
Person 99.7
Apparel 96.7
Clothing 96.7
Shorts 85.7
Female 80.5
Face 68.6
Outdoors 67.4
Photography 64.7
Photo 64.7
Leisure Activities 63
Brick 61.8
Girl 61.7
Woman 60.7
Path 60.5
Nature 60.3
Hug 59.2
Shoe 55.7
Footwear 55.7

Imagga
created on 2022-01-09

man 28.9
people 25.6
male 25.5
adult 22.2
person 19.9
wind instrument 19.8
business 18.8
brass 18.7
musical instrument 18.4
black 18.1
businessman 16.8
sitting 15.5
silhouette 14.9
sax 13.9
office 13.3
men 12.9
world 12.8
photographer 12.2
couple 12.2
building 11.6
passenger 11.2
youth 11.1
alone 11
lifestyle 10.8
travel 10.6
suit 10.5
corporate 10.3
love 10.3
happy 10
city 10
working 9.7
outdoors 9.7
computer 9.6
together 9.6
happiness 9.4
outdoor 9.2
trombone 9.1
cornet 9
shadow 9
looking 8.8
women 8.7
day 8.6
life 8.3
leisure 8.3
freedom 8.2
group 8.1
light 8
professional 8
job 8
executive 7.9
urban 7.9
work 7.8
casual 7.6
statue 7.6
relax 7.6
device 7.5
dark 7.5
one 7.5
holding 7.4
symbol 7.4
back 7.3
music 7.2
stringed instrument 7.1
family 7.1
summer 7.1
indoors 7
sky 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.7
person 94.8
outdoor 94.2
man 92.4
clothing 83.1
black and white 73.3
street 50.4

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 96.5%
Calm 64.8%
Happy 31.7%
Sad 1.1%
Surprised 0.9%
Confused 0.5%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Calm 70.9%
Happy 10%
Surprised 6.2%
Fear 5.6%
Sad 4.9%
Angry 1%
Disgusted 1%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a man sitting on a bench 82.9%
a man sitting in front of a building 82.8%
a man sitting on a bench in front of a building 75.3%

Text analysis

Amazon

or

Google

10
10