Human Generated Data

Title

Untitled (proof print: woman in head scarves and long beaded necklaces)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1160
Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99
Human 99
Clothing 97.1
Apparel 97.1
Female 91.5
Floor 83.5
Woman 82.4
Dress 64.3
Costume 63.7
Face 58.7

Clarifai
created on 2023-10-26

people 99.8
portrait 98.5
woman 98.1
one 97.9
adult 97.4
fashion 96.4
model 95.7
wear 95.3
monochrome 94.2
girl 92.2
retro 91.9
vintage 91.8
dress 89.4
music 87.1
art 85.4
street 80.1
man 76.7
sexy 74.4
costume 72.9
black and white 72.9

Imagga
created on 2022-01-22

adult 23.7
fashion 23.4
portrait 23.3
dress 22.6
person 21.2
attractive 20.3
people 20.1
lady 19.5
model 18.7
sexy 18.5
outfit 18.2
clothing 17.1
interior 15.9
teacher 15.7
man 15.4
pretty 15.4
male 14.9
vintage 14.9
old 14.6
happy 13.8
bride 13.4
gown 13.3
standing 13
sensual 12.7
love 12.6
elegance 12.6
couple 12.2
looking 12
black 12
hair 11.9
room 11.6
groom 11.5
educator 11.3
home 11.2
style 11.1
posing 10.7
brunette 10.4
luxury 10.3
culture 10.3
wedding 10.1
make 10
garment 9.7
body 9.6
wall 9.5
vestment 9.4
indoor 9.1
life 9.1
human 9
one 9
professional 8.9
family 8.9
domestic 8.7
ancient 8.6
men 8.6
elegant 8.6
smile 8.5
clothes 8.4
retro 8.2
lifestyle 7.9
indoors 7.9
color 7.8
two 7.6
marriage 7.6
bouquet 7.5
religious 7.5
traditional 7.5
blond 7.4
costume 7.3
sensuality 7.3
gorgeous 7.2
art 7.2
religion 7.2
history 7.2
face 7.1
happiness 7.1
together 7
look 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.6
standing 96.2
person 96.1
woman 95.8
posing 92.5
holding 90.6
dress 87.8
handwriting 86.9
drawing 83.1
clothing 82.7
dressed 29.8
clothes 17.3

Color Analysis

Face analysis
AWS Rekognition

Age 52-60
Gender Female, 100%
Calm 80.6%
Confused 12.4%
Sad 4%
Angry 1.1%
Happy 0.7%
Surprised 0.6%
Disgusted 0.4%
Fear 0.2%

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Text analysis

Amazon

over
white
white grind drope
drope
grind

Google

overo Mankecd
overo
Mankecd