Human Generated Data

Title

Untitled (seated woman)

Date

c. 1910

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.702

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 97.6
Apparel 97.6
Furniture 97.1
Human 96.5
Person 96.5
Female 78.5
Robe 69.3
Evening Dress 69.3
Fashion 69.3
Gown 69.3
Corset 65.8
Woman 63.7
Flooring 62.9
Door 61.7
Leisure Activities 59.3
Costume 55.7
Cabinet 55
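The Amazon tags above are label/confidence pairs on a 0-100 scale. A minimal sketch of filtering such a list down to high-confidence labels, using a hand-copied subset of the values above (the threshold of 90 is an arbitrary illustration, not part of the source record):

```python
# Label/confidence pairs copied from the Amazon tag list above (0-100 scale).
amazon_tags = {
    "Clothing": 97.6,
    "Apparel": 97.6,
    "Furniture": 97.1,
    "Human": 96.5,
    "Person": 96.5,
    "Female": 78.5,
    "Corset": 65.8,
    "Cabinet": 55.0,
}

def high_confidence(tags, threshold=90.0):
    """Return labels at or above the threshold, highest confidence first."""
    return sorted(
        (label for label, score in tags.items() if score >= threshold),
        key=lambda label: -tags[label],
    )

print(high_confidence(amazon_tags))
# → ['Clothing', 'Apparel', 'Furniture', 'Human', 'Person']
```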

Imagga
created on 2022-01-08

sexy 42.6
fashion 42.2
model 38.9
attractive 38.5
pretty 33.6
adult 29.9
dress 28.9
portrait 27.8
outfit 27
hair 26.9
lady 26
style 26
person 25.8
sensual 22.7
posing 22.2
gorgeous 21.7
stylish 21.7
black 21
makeup 20.1
cute 20.1
hairstyle 20
people 19.5
face 18.5
elegance 17.6
body 17.6
sensuality 16.3
blond 15.7
happy 15.7
make 15.4
studio 15.2
smile 15
feminine 14.9
holding 14.9
brunette 14.8
fashionable 14.2
happiness 14.1
standing 13.9
elegant 13.7
human 13.5
mother 12.9
luxury 12.9
women 12.7
lovely 12.4
interior 12.4
sitting 12
glamorous 11.6
vintage 11.6
skin 11
pose 10.9
lifestyle 10.8
musical instrument 10.7
erotic 10.4
dancer 10.4
chair 10.2
performer 10.1
music 9.9
retro 9.8
glamor 9.6
legs 9.4
guitar 9.4
parent 9.3
classic 9.3
clothing 9.1
one 9
cheerful 8.9
looking 8.8
vogue 8.7
wall 8.6
musician 8.5
nice 8.2
bag 8.2
smiling 8
look 7.9
provocative 7.8
modern 7.7
expression 7.7
passion 7.5
clothes 7.5
lingerie 7.4
20s 7.3
skirt 7.3
indoors 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.6
clothing 97.7
dress 97
woman 90.9
smile 86.9
person 85.8
human face 76.9
girl 50.4
old 45.3
vintage 28.8

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 100%
Calm 98.3%
Surprised 0.6%
Confused 0.4%
Happy 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Sad 0.1%
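The AWS Rekognition emotion scores above sum to roughly 100%, and the dominant emotion is simply the highest-scoring entry. A small sketch, with the percentages hand-copied from the list above:

```python
# Emotion/confidence pairs copied from the AWS Rekognition output above.
emotions = {
    "Calm": 98.3,
    "Surprised": 0.6,
    "Confused": 0.4,
    "Happy": 0.1,
    "Angry": 0.1,
    "Disgusted": 0.1,
    "Fear": 0.1,
    "Sad": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # → ('Calm', 98.3)
```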

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.5%

Captions

Microsoft

a vintage photo of a woman 89.3%
a vintage photo of a woman sitting in a chair 87.9%
a vintage photo of a woman sitting in a room 87.8%
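The Microsoft captions are candidate descriptions ranked by confidence, so selecting the best one is a single `max` call. A sketch using the values above:

```python
# Caption/confidence pairs copied from the Microsoft output above.
captions = [
    ("a vintage photo of a woman", 89.3),
    ("a vintage photo of a woman sitting in a chair", 87.9),
    ("a vintage photo of a woman sitting in a room", 87.8),
]

best_caption, confidence = max(captions, key=lambda pair: pair[1])
print(best_caption)  # → a vintage photo of a woman
```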