Human Generated Data

Title

Untitled (formally dressed men and woman, Philadelphia, PA)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8220

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98
Human 98
Person 97.3
Clothing 97.3
Apparel 97.3
Canine 71.9
Animal 71.9
Mammal 71.9
Pet 67.5
Dog 66.5
Poodle 66.5
Wildlife 64.8
Bear 64.8
Sunglasses 63.9
Accessories 63.9
Accessory 63.9
Face 60.7
Coat 58.5
Overcoat 58.5
Couch 57.9
Furniture 57.9
Sheep 57.7
Hair 57.2
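
The labels above, with confidence scores from roughly 57 to 98, match the shape of output from Amazon Rekognition's DetectLabels operation (the service named under Face analysis below). The following is a minimal sketch of such a call using boto3; the local file name and the MinConfidence value are illustrative assumptions, not values taken from this record.

```python
# Sketch only: fetching object/scene labels with Amazon Rekognition via boto3.
# The file name and MinConfidence threshold are assumptions for illustration.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed floor; the list above bottoms out around 57%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```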

Imagga
created on 2022-01-08

man 36.9
people 34.6
male 31.2
person 30.8
hairdresser 27.5
senior 25.3
grandma 22.5
nurse 21.5
adult 21.2
medical 21.2
patient 21
old 19.5
men 18
coat 17.5
home 17.5
portrait 17.5
hospital 17.1
happy 16.9
doctor 16
couple 15.7
elderly 15.3
room 14.7
smiling 14.5
husband 14.3
family 14.2
together 14
professional 13.8
smile 13.5
lab coat 13.5
women 13.4
wife 13.3
retired 12.6
health 12.5
retirement 12.5
specialist 12.2
face 12.1
surgeon 12
love 11.8
illness 11.4
looking 11.2
mature 11.2
hair 11.1
team 10.7
worker 10.7
medicine 10.6
sitting 10.3
lifestyle 10.1
hand 9.9
indoors 9.7
work 9.4
happiness 9.4
sick person 9.3
grandfather 9.2
case 9.1
holding 9.1
care 9.1
operation 8.9
surgery 8.8
garment 8.2
mother 7.9
cap 7.9
chair 7.9
grandmother 7.8
hands 7.8
lab 7.8
shower cap 7.8
laboratory 7.7
expression 7.7
loving 7.6
clothing 7.6
equipment 7.4
camera 7.4
cheerful 7.3
mask 7.3
lady 7.3
business 7.3
office 7.2
black 7.2
working 7.1
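
Tags of this form are what Imagga's v2 tagging endpoint returns. Below is a hedged sketch of such a request with the requests library; the credentials and file name are placeholders, and the response parsing assumes Imagga's documented result/tags JSON shape.

```python
# Sketch only: requesting tags from the Imagga v2 /tags endpoint.
# Credentials and the file name are placeholders, not values from this record.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```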

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 89.3
black and white 82.3
cat 51.9
human face 51
reflection 50.4
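
Tags like these resemble the output of Azure Computer Vision's image-tagging operation, which is presumably the "Microsoft" service here. A minimal sketch with the Python SDK follows; the endpoint, key, and file name are placeholders, not values from this record.

```python
# Sketch only: image tagging with the Azure Computer Vision Python SDK.
# Endpoint, key, and file name are placeholders, not values from this record.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # The SDK reports confidence on a 0-1 scale; scale to match the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```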

Face analysis

Amazon

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Sad 46.8%
Surprised 23.2%
Angry 17.1%
Calm 3.5%
Fear 3.1%
Disgusted 2.9%
Confused 1.9%
Happy 1.6%
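
The age range, gender, and emotion scores above are the fields AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal boto3 sketch follows; the file name is an assumption.

```python
# Sketch only: facial attribute estimation with Amazon Rekognition via boto3.
# The file name is an assumption; the printed fields mirror the record above.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```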

Feature analysis

Amazon

Person 98%
Bear 64.8%
Sunglasses 63.9%
Sheep 57.7%

Captions

Microsoft

a man holding a cat in front of a mirror 75.2%
a man holding a cat in front of a mirror posing for the camera 67.2%
a man and a cat in front of a mirror 67.1%
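
Ranked captions with graded confidences like these are what Azure Computer Vision's describe-image operation produces when several candidates are requested. A sketch with the Python SDK follows; the endpoint, key, and file name are placeholders, and max_candidates=3 is assumed from the three captions listed above.

```python
# Sketch only: caption generation with the Azure Computer Vision Python SDK.
# Endpoint, key, and file name are placeholders; max_candidates=3 is assumed
# from the three captions listed above.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```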

Text analysis

Amazon

6898
689.8
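
Detected strings like these are typically the output of Amazon Rekognition's DetectText operation (OCR). A minimal boto3 sketch follows; the file name is an assumption.

```python
# Sketch only: text detection (OCR) with Amazon Rekognition via boto3.
# The file name is an assumption, not a value from this record.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # print line-level results, skip word-level duplicates
        print(detection["DetectedText"])
```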

Google

6898
6898
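
The Google results resemble output from the Cloud Vision API's text detection, where the first annotation is the full detected text and subsequent annotations repeat it per block. A hedged sketch with the official client library follows; the file name is an assumption, and credentials are expected to come from the environment rather than from this record.

```python
# Sketch only: text detection with the Google Cloud Vision client library.
# The file name is an assumption; credentials come from the environment
# (e.g. GOOGLE_APPLICATION_CREDENTIALS), not from this record.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_untitled_1938.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are per-block results.
for annotation in response.text_annotations:
    print(annotation.description)
```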