Human Generated Data

Title

Untitled (man standing with prosthetic leg)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8250

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Apparel 98.9
Clothing 98.9
Person 98.3
Human 98.3
Shoe 70.3
Footwear 70.3
Dance 70.3
Coat 69.7
Sleeve 66.4
Room 65.6
Indoors 65.6
Costume 65.6
Overcoat 64.1
Female 63.9
Suit 62.9
Furniture 62.8
Chair 62.8
Door 61.9
Advertisement 60.5
Pants 60.5
Poster 59.7
Collage 59.7
Underwear 58.1
Lingerie 57.1
Evening Dress 57
Gown 57
Fashion 57
Robe 57
Shorts 56.5
Person 49.5

Imagga
created on 2022-01-08

sexy 32.9
fashion 31.7
portrait 28.5
person 27.4
attractive 25.9
adult 25.5
model 24.9
hair 23.8
lady 23.5
pretty 23.1
dress 21.7
posing 21.3
style 20
sensuality 20
people 19.5
body 19.2
human 18.8
black 17.8
lifestyle 16.6
elegance 16
cute 15.8
legs 15.1
clothing 15
sensual 14.6
women 14.2
face 14.2
skin 13.5
one 13.4
lovely 13.3
elegant 12.9
man 12.1
wall 12
street 12
stylish 11.8
erotic 11.7
vogue 11.6
sexual 11.6
urban 11.4
window 11.2
slim 11
gorgeous 10.9
dark 10.9
newspaper 10.7
studio 10.6
brunette 10.5
standing 10.4
makeup 10.1
happy 10
girls 10
garment 10
city 10
pose 10
femininity 9.7
passion 9.4
lips 9.3
head 9.2
bathroom 9.2
shower 8.9
water 8.7
high 8.7
product 8.6
male 8.6
smile 8.6
fashionable 8.5
expression 8.5
hot 8.4
looking 8
night 8
blond 8
love 7.9
look 7.9
nude 7.8
eyes 7.7
naked 7.7
old 7.7
serious 7.6
vintage 7.4
make 7.3
skirt 7.2
wet 7.2
bright 7.2
cool 7.1
interior 7.1
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.9
man 94.6
drawing 93.2
book 92.3
person 90.5
black and white 90
clothing 89.5
sketch 60.5

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 100%
Happy 46.1%
Calm 30.6%
Confused 15.9%
Surprised 2.7%
Sad 2%
Disgusted 1.6%
Angry 0.6%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Shoe 70.3%

Captions

Microsoft

an old photo of a man 72.7%
old photo of a man 71.5%
a man standing next to a book 31.1%

Text analysis

Amazon

7656.
A70A
MSIA YESTAD A70A
MSIA
YESTAD
3123477

Google

A7OA
6.
YT37A2
765
MSI7
765 6. MSI7 YT37A2 A7OA