Human Generated Data

Title

Untitled (woman by window)

Date

1970s

People

Artist: Susan Meiselas, American, born 1948

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1860

Copyright

© Susan Meiselas / Magnum

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99
Human 99
Furniture 98.4
Clothing 92
Apparel 92
Interior Design 91.5
Indoors 91.5
Home Decor 83.5
Sitting 81.4
Living Room 75.4
Room 75.4
Text 66.7
Portrait 62
Face 62
Photography 62
Photo 62
Couch 57.2
Shelf 56.5

Imagga
created on 2022-01-22

person 34.7
model 34.2
sexy 31.3
attractive 30.8
fashion 28.7
adult 28.2
portrait 27.8
people 26.8
black 26.6
hair 23
body 22.4
pretty 20.3
sensual 20
lady 19.5
face 19.2
skin 18.6
passion 17.9
lifestyle 17.4
man 16.8
studio 16.7
dark 16.7
nude 16.5
naked 16.4
style 16.3
erotic 16.1
posing 16
human 15.8
brunette 14.8
sensuality 14.5
women 14.2
one 14.2
sitting 13.8
sexual 13.5
love 13.4
scholar 12.3
musical instrument 12.2
hand 12.2
male 12.1
looking 12
desire 11.5
glamor 11.5
elegant 11.1
makeup 11.1
make 10.9
gorgeous 10.9
pose 10.9
dress 10.8
blond 10.4
expression 10.2
lips 10.2
elegance 10.1
stylish 10
intellectual 9.9
seductive 9.6
lying 9.4
dancer 9
clothing 8.9
keyboard instrument 8.8
breast 8.8
suit 8.8
sex 8.8
vogue 8.7
couple 8.7
couch 8.7
hands 8.7
cute 8.6
hot 8.4
emotion 8.3
room 8.2
healthy 8.2
world 8.2
happy 8.2
art 8.1
lovely 8
happiness 7.8
eyes 7.8
glamorous 7.7
underwear 7.7
youth 7.7
cover girl 7.6
legs 7.6
relaxation 7.5
product 7.5
accordion 7.5
home 7.2
romance 7.1
professional 7.1
computer 7.1
interior 7.1
look 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

indoor 97.8
person 97
wall 95.7
black and white 92.3
text 92.1
book 89.2
clothing 83.2
human face 76.4
girl 63.1

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 100%
Calm 94.1%
Sad 2.5%
Confused 1.3%
Surprised 0.8%
Angry 0.5%
Disgusted 0.3%
Happy 0.3%
Fear 0.2%

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Couch 57.2%

Captions

Microsoft

a man sitting on a bed 83.7%
a man sitting in a room 83.6%
a man sitting on a couch 83.5%