Human Generated Data

Title

Untitled (formally dressed man and women talking near a couch, Philadelphia, PA)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8222

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.6
Human 98.6
Floor 98
Person 96.5
Person 96
Flooring 94.7
Dog 91.1
Pet 91.1
Canine 91.1
Mammal 91.1
Animal 91.1
Tie 87.5
Accessories 87.5
Accessory 87.5
Person 86.9
Apparel 77.2
Clothing 77.2
Face 64.7
Photography 64
Photo 64
Portrait 61.7
Indoors 59.3
Advertisement 55.5
Poster 55.5
Female 55
Girl 55

Imagga
created on 2022-01-08

man 30.4
person 30.2
weight 26
equipment 25.1
people 22.9
checker 22
barbell 21.6
city 21.6
sports equipment 19.9
plunger 18.2
adult 17.6
male 17.1
game equipment 17
men 16.3
street 15.6
hand tool 14.6
portrait 13.6
urban 13.1
building 12.8
tool 12.8
human 12
house 11.7
sitting 11.2
wall 11.1
dumbbell 11.1
black 10.8
leisure 10.8
businessman 10.6
boy 10.4
room 10.4
home 10.4
architecture 10.1
lifestyle 10.1
shop 9.9
life 9.7
business 9.7
group 9.7
together 9.6
barbershop 9.4
athlete 9.2
old 9
fashion 9
happy 8.8
women 8.7
sidewalk 8.7
crowd 8.6
casual 8.5
two 8.5
floor 8.4
exercise 8.2
working 8
world 7.9
couple 7.8
modern 7.7
walking 7.6
dark 7.5
style 7.4
light 7.3
design 7.3
alone 7.3
dress 7.2
romance 7.1
handsome 7.1
romantic 7.1
worker 7.1
work 7.1
hand 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

floor 97
black and white 85.5
indoor 85.2
clothing 80.8
white 74.5
text 71.2
person 70.5

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Male, 84.9%
Calm 95.6%
Sad 3.2%
Surprised 0.6%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 36-44
Gender Male, 87.9%
Calm 99.6%
Sad 0.2%
Confused 0.2%
Happy 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

Feature analysis

Amazon

Person 98.6%
Dog 91.1%
Tie 87.5%

Captions

Microsoft

a group of people in a room 83.9%
a group of people standing in a room 78.5%
a group of people posing for a photo 59.1%

Text analysis

Amazon

6912
LIFA

Google

6912
6912