Human Generated Data

Title

Untitled (woman posed on chair mirroring a small painting on the floor)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10618

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.3
Apparel 99.3
Monitor 95.3
Screen 95.3
Electronics 95.3
Display 95.3
Human 93.7
Person 93.2
Dress 89
Female 87.8
Woman 71.1
LCD Screen 69.9
TV 67.1
Television 67.1
Portrait 63.9
Photography 63.9
Face 63.9
Photo 63.9
Girl 60.7
Furniture 58.4
Door 58
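Each Amazon tag above is paired with a confidence score. A minimal sketch of how such output might be filtered by a confidence cutoff (the sample pairs are copied from the list above; the 90.0 threshold is an assumption, not part of the record):

```python
# Sample (label, confidence) pairs taken from the Amazon tags above.
labels = [
    ("Clothing", 99.3),
    ("Monitor", 95.3),
    ("Human", 93.7),
    ("Person", 93.2),
    ("Dress", 89.0),
    ("Woman", 71.1),
    ("Door", 58.0),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels at or above a confidence threshold (hypothetical cutoff)."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # labels scored at 90.0 or higher
```

Raising or lowering the threshold trades recall for precision; the record itself lists every label the service returned, however weak.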

Imagga
created on 2022-01-09

dancer 41.2
fashion 39.2
dress 38
attractive 37.1
model 36.6
person 35.4
sexy 33.7
performer 32.1
pretty 28.7
elegance 27.7
lady 27.6
portrait 27.2
hair 24.6
adult 24.4
body 24
black 23.7
posing 22.2
face 22
style 20.8
make 20
elegant 19.7
people 19.5
entertainer 19.2
sensual 19.1
sensuality 19.1
studio 19
clothing 18.2
gorgeous 16.3
pose 16.3
outfit 15.9
happy 15.7
hairstyle 15.3
stylish 14.5
human 14.3
fashionable 14.2
makeup 13.7
cute 13.6
seductive 12.4
dance 11.9
costume 11.6
blond 11.5
bride 11.5
brunette 11.3
luxury 11.2
skin 11.1
women 11.1
dinner dress 10.8
smile 10.7
lovely 10.7
vogue 10.7
sitting 10.3
feminine 10.3
classic 10.2
lips 10.2
dark 10
art 9.9
gown 9.8
chair 9.6
eyes 9.5
clothes 9.4
hand 9.1
sculpture 8.9
figure 8.9
looking 8.8
erotic 8.7
lifestyle 8.7
party 8.6
formal 8.6
stage 8.5
vintage 8.3
nice 8.3
look 7.9
vertical 7.9
happiness 7.8
statue 7.8
bridal 7.8
glamorous 7.7
performance 7.7
head 7.6
legs 7.6
evening 7.5
wedding 7.4
back 7.3
love 7.1
interior 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.3
person 88.2
dance 75.5
statue 60.1

Face analysis

Amazon

Google

AWS Rekognition

Age 26-36
Gender Female, 99.1%
Happy 50.5%
Calm 33.6%
Surprised 11.7%
Disgusted 1.2%
Angry 1.1%
Sad 1%
Fear 0.5%
Confused 0.5%
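The emotion percentages above sum to roughly 100, and a consumer of this record would typically report only the dominant one. A minimal sketch, with the values copied from the face analysis above:

```python
# Emotion scores from the AWS Rekognition face analysis above (percent).
emotions = {
    "Happy": 50.5,
    "Calm": 33.6,
    "Surprised": 11.7,
    "Disgusted": 1.2,
    "Angry": 1.1,
    "Sad": 1.0,
    "Fear": 0.5,
    "Confused": 0.5,
}

# The dominant emotion is simply the highest-scoring entry.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # → Happy 50.5
```

Note that the top score here is barely above 50%, so "Happy" is a weak call; the Google Vision result below rates Joy "Very unlikely" for the same face.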

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.2%

Captions

Microsoft

a person standing in front of a laptop 44.3%
a person jumping up in the air 34%
a person standing in a room 33.9%

Text analysis

Amazon

YOORASHAGOX