Human Generated Data

Title

Untitled (handicapped girl lying on a cart next to a toy dog)

Date

c. 1970

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11457

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Dress 99.8
Clothing 99.8
Apparel 99.8
Female 98.2
Human 98.2
Person 95.8
Furniture 94.1
Chair 90.2
Woman 90.1
Face 89.9
Animal 87.5
Mammal 87.5
Canine 87.5
Dog 87.5
Pet 87.5
Girl 83.7
Blonde 83.7
Kid 83.7
Teen 83.7
Child 83.7
Table 83.1
Gown 81.7
Fashion 81.7
Floor 78.2
Indoors 77.3
Robe 74.8
Photography 72.2
Photo 72.2
Portrait 72.2
Baby 69.2
Room 68.8
Bed 66.5
Coat 63.8
Suit 63.8
Overcoat 63.8
Evening Dress 60.3
Sitting 60
Plant 59.9
Bedroom 59
Flooring 58.2

Imagga
created on 2022-01-14

grand piano 50.2
piano 41.5
stringed instrument 34.6
groom 31.8
percussion instrument 31.2
keyboard instrument 30.9
person 25
musical instrument 24.2
people 24
male 22.7
adult 22
negative 21.9
man 21.5
happy 18.8
film 16.3
portrait 14.9
bride 14.7
smiling 14.5
happiness 14.1
sitting 13.7
wedding 12.9
smile 12.8
love 12.6
photographic paper 12.6
laptop 12.2
fashion 12.1
fun 12
one 11.9
work 11.8
dress 11.7
cheerful 11.4
attractive 11.2
women 11.1
couple 10.4
sexy 10.4
business 10.3
black 10.3
model 10.1
face 9.9
worker 9.8
office 9.8
job 9.7
celebration 9.6
hair 9.5
wife 9.5
men 9.4
lifestyle 9.4
holiday 9.3
leisure 9.1
human 9
working 8.8
indoors 8.8
home 8.8
water 8.7
clothing 8.6
married 8.6
marriage 8.5
youth 8.5
outdoor 8.4
elegance 8.4
photographic equipment 8.4
studio 8.4
dark 8.3
computer 8.3
businessman 7.9
boy 7.8
gown 7.8
art 7.8
professional 7.7
pretty 7.7
industry 7.7
holding 7.4
safety 7.4
indoor 7.3
music 7.2
looking 7.2
steel 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

text 98.9
black and white 92
person 75.2
clothing 69.8

Face analysis

Amazon

AWS Rekognition

Age 14-22
Gender Male, 77.3%
Calm 82.7%
Sad 15.2%
Fear 0.7%
Disgusted 0.4%
Surprised 0.3%
Happy 0.3%
Confused 0.2%
Angry 0.2%

Feature analysis

Amazon

Person 95.8%
Dog 87.5%

Captions

Microsoft

a person standing in front of a window 69.9%
a person sitting in front of a window 60.6%
a person sitting next to a window 55%

Text analysis

Amazon

45055

Google

OMU
022
S
OMU VT102 022 4505 S
4505
VT102