Human Generated Data

Title

Untitled (woman in a costume with make-up artist backstage)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5679

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 99.9
Apparel 99.9
Furniture 99.7
Chair 99.7
Person 98.4
Human 98.4
High Heel 97.2
Shoe 97.2
Footwear 97.2
Chair 94.9
Table 80.1
Female 79.1
Interior Design 74.6
Indoors 74.6
Sitting 72
Woman 62.1
Photography 61
Photo 61
Room 60.2
Dining Table 56.7
Cafe 55.9
Restaurant 55.9
High Heel 55.3
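
These label/confidence pairs could be reproduced with Amazon Rekognition's DetectLabels operation, which reports confidences on a 0-100 scale. A minimal sketch, assuming boto3 with configured AWS credentials and a local copy of the photograph (the file name is hypothetical):

import boto3

# Assumption: AWS credentials and a default region are configured;
# "steinmetz_13860.jpg" is a hypothetical local file name.
client = boto3.client("rekognition")

with open("steinmetz_13860.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score shown above is 55.3
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")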

Imagga
created on 2021-12-15

sexy 34.5
adult 28.5
attractive 25.2
person 24.2
sensual 21.8
black 20.6
fashion 20.4
chair 20
people 19.5
body 19.2
lady 18.7
interior 18.6
pretty 18.2
hair 17.4
sitting 17.2
model 17.1
indoors 16.7
style 16.3
erotic 16.3
portrait 16.2
blond 14.5
women 14.2
furniture 14.2
device 14
gorgeous 13.6
home 13.6
skin 13.5
lifestyle 13
one 12.7
elegance 12.6
youth 11.9
sensuality 11.8
dress 11.7
house 11.7
seductive 11.5
luxury 11.1
elegant 11.1
lingerie 10.8
studio 10.6
seat 10.5
brunette 10.5
legs 10.4
cute 10
rocking chair 10
clothing 10
happy 10
posing 9.8
room 9.8
human 9.7
looking 9.6
apartment 9.6
face 9.2
relaxation 9.2
hot 9.2
retro 9
man 8.7
shoes 8.6
eyes 8.6
performer 8.6
health 8.3
slim 8.3
appliance 8.2
salon 8.1
bedroom 8.1
home appliance 8
musical instrument 7.9
machine 7.9
vertical 7.9
long hair 7.8
bed 7.6
dishwasher 7.5
fun 7.5
vintage 7.4
lips 7.4
cheerful 7.3
indoor 7.3
instrument 7.2
smile 7.1
love 7.1
look 7
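
These tags match the shape of Imagga's v2 tagging API, which returns each tag with an English name and a confidence score. A minimal sketch, assuming Imagga's public /v2/tags endpoint and response format; the credentials and image URL are hypothetical placeholders:

import requests

# Assumption: Imagga v2 API with hypothetical credentials and image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_13860.jpg"},
    auth=("api_key", "api_secret"),
)
resp.raise_for_status()
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")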

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.5
chair 95
furniture 94.6
person 75
clothing 71.7
table 71
black and white 69.5
footwear 60.2
computer 34.4
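
These tags match the output of the Azure Computer Vision Tag operation, which reports confidences on a 0-1 scale (shown above as percentages). A minimal sketch, assuming the v3.2 REST endpoint; the resource endpoint, key, and image URL are hypothetical:

import requests

# Assumption: Azure Computer Vision v3.2 with hypothetical endpoint and key.
endpoint = "https://<resource>.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "<key>"},
    json={"url": "https://example.org/steinmetz_13860.jpg"},
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")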

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 96.2%
Calm 91.2%
Surprised 2.9%
Sad 2.2%
Happy 1.7%
Confused 0.9%
Angry 0.8%
Fear 0.2%
Disgusted 0.1%
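
The age range, gender, and emotion scores above correspond to Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch, assuming boto3 with configured AWS credentials (hypothetical file name):

import boto3

# Assumption: AWS credentials configured; hypothetical local file name.
client = boto3.client("rekognition")

with open("steinmetz_13860.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")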

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
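
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the values above. A minimal sketch using the google-cloud-vision client library, assuming configured credentials (hypothetical file name):

from google.cloud import vision

# Assumption: google-cloud-vision installed and credentials configured;
# hypothetical local file name.
client = vision.ImageAnnotatorClient()

with open("steinmetz_13860.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)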

Feature analysis

Amazon

Chair 99.7%
Person 98.4%
High Heel 97.2%

Captions

Microsoft

a person sitting on a chair 75.1%
a person sitting in a chair 75%
a man and a woman sitting on a chair 40.9%
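
Captions such as these come from the Azure Computer Vision Describe operation, which returns ranked caption candidates with 0-1 confidences. A minimal sketch, assuming the v3.2 REST endpoint; the resource endpoint, key, and image URL are hypothetical:

import requests

# Assumption: Azure Computer Vision v3.2 with hypothetical endpoint and key.
endpoint = "https://<resource>.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": "<key>"},
    json={"url": "https://example.org/steinmetz_13860.jpg"},
)
resp.raise_for_status()
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")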

Text analysis

Amazon

13860
60.
13860.
T-T-
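
These strings match the output of Rekognition's DetectText operation, which returns both whole LINE detections and the individual WORDs within them. A minimal sketch that prints only the lines, assuming boto3 with configured AWS credentials (hypothetical file name):

import boto3

# Assumption: AWS credentials configured; hypothetical local file name.
client = boto3.client("rekognition")

with open("steinmetz_13860.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is either a whole LINE or an individual WORD.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])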

Google

13860· AGON-YT3RA2 MAMT2A3 13860. 13860.
13860·
AGON-YT3RA2
MAMT2A3
13860.
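
Google Vision's text detection returns the full detected text as the first annotation, followed by the individual segments, which explains the repetition above; the strings are reproduced exactly as the OCR detected them. A minimal sketch, assuming the google-cloud-vision client library and configured credentials (hypothetical file name):

from google.cloud import vision

# Assumption: google-cloud-vision installed and credentials configured;
# hypothetical local file name.
client = vision.ImageAnnotatorClient()

with open("steinmetz_13860.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
annotations = response.text_annotations
if annotations:
    # The first annotation holds the entire detected text; the rest
    # are the individual detected segments.
    print(annotations[0].description)
    for segment in annotations[1:]:
        print(segment.description)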