Human Generated Data

Title

Untitled (man and woman at front door of house)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5271

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Apparel 99.7
Clothing 99.7
Human 99.2
Person 99.2
Person 98.6
Home Decor 92.7
Costume 88.7
Female 86.9
Dress 86.4
Person 83.1
Face 79.9
Plant 78.1
Nature 78.1
Outdoors 77.2
Building 76.5
Shelter 76.5
Countryside 76.5
Rural 76.5
People 75.7
Door 72.8
Housing 71.8
Overcoat 71
Coat 71
Woman 70.8
Suit 68.3
Photo 66.1
Photography 66.1
Portrait 66.1
Tree 63.3
Pottery 61.4
Potted Plant 61.4
Jar 61.4
Vase 61.4
Window 60.7
Girl 60.2
Hat 59.3

Imagga
created on 2022-01-22

barbershop 39.3
shop 30.8
man 24.9
mercantile establishment 23.7
male 20.8
person 18
place of business 15.8
people 15.6
building 14.1
hairdresser 13.3
couple 13.1
wall 12.8
dress 12.6
adult 12.1
old 11.8
world 11.7
portrait 11.6
family 11.6
groom 11.4
architecture 10.9
business 10.9
life 10.5
men 10.3
love 10.3
house 10
art 9.8
standing 9.6
ancient 9.5
happy 9.4
youth 9.4
travel 9.2
city 9.1
waiter 9.1
looking 8.8
happiness 8.6
worker 8.6
statue 8.6
smile 8.6
monument 8.4
street 8.3
tourism 8.2
outdoors 8.2
suit 8.1
romantic 8
smiling 8
businessman 7.9
establishment 7.9
together 7.9
black 7.8
antique 7.8
mother 7.6
fashion 7.5
human 7.5
dining-room attendant 7.3
employee 7.2
musical instrument 7.2
home 7.2
nurse 7.1
women 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

building 100
text 99
outdoor 97.3
clothing 87.4
brick 81
person 77.4
house 73.3
woman 69.1
dress 63.8
store 42.6
curb 8.4

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.5%
Calm 98%
Surprised 0.8%
Sad 0.7%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a person standing in front of a brick building 85.5%
a person standing in front of a brick building 85.4%
a person that is standing in front of a brick building 83.2%

Text analysis

Amazon

5439
32A8
32A8 YE33AB 930N3330
930N3330
YE33AB

Google

5439
5439