Human Generated Data

Title

Untitled (fashion portrait of woman in day wear holding stylized pose near large mirror)

Date

c. 1961-1962

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9793

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Human 97.2
Person 97.2
Clothing 87.2
Apparel 87.2
Flooring 85.8
Floor 81.3
Furniture 75
Female 70.8
Mirror 66.2
Indoors 63.5
Room 62
Photography 61.5
Photo 61.5
Woman 60.9
Face 59.9
Portrait 59.9
Footwear 56.1
Shoe 56.1

Imagga
created on 2022-01-24

crutch 37
staff 29.6
fashion 29.4
person 28.7
portrait 27.8
people 27.3
model 27.2
adult 24.3
dress 23.5
attractive 23.1
stick 21.7
body 21.6
elegance 21
sexy 20.9
posing 20.4
clothing 19.9
pretty 18.2
human 18
lady 17.8
face 17.8
studio 15.2
black 14.6
hair 14.3
style 14.1
standing 13.9
cute 13.6
looking 13.6
clothes 13.1
sensuality 12.7
one 12.7
happy 12.5
smile 12.1
male 12.1
make 11.8
stylish 11.8
health 11.1
man 10.7
blond 10.7
women 10.3
lifestyle 10.1
long 10.1
art 10
bride 9.6
brunette 9.6
fashionable 9.5
smiling 9.4
sensual 9.1
gorgeous 9.1
pose 9.1
dancer 9
full length 8.7
happiness 8.6
eyes 8.6
statue 8.6
elegant 8.6
pink 8.4
professional 7.9
costume 7.7
hairstyle 7.6
back 7.3
domestic 7.3
makeup 7.3
lovely 7.1
gown 7.1
cleaner 7.1
medical 7.1
indoors 7

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

dress 89.8
clothing 86.2
text 85.8
person 78.2
drawing 74.3
woman 71.1
footwear 53.3

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 95.3%
Calm 64.5%
Surprised 14.9%
Happy 11.9%
Disgusted 2.8%
Angry 2.4%
Confused 1.6%
Sad 1.3%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.2%
Shoe 56.1%

Captions

Microsoft

a person standing in front of a window 69.2%
a man and a woman standing in front of a window 35.4%
a person standing in front of a window 35.3%

Text analysis

Amazon

MJ17--YT37A--XAX