Human Generated Data

Title

Untitled (sleeping girl)

Date

c. 1935

People

Artist: C. Bennette Moore, American 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.977
Machine Generated Data

Tags

Amazon
created on 2022-01-09

Painting 95.8
Art 95.8
Person 91.3
Human 91.3
Home Decor 69.7
Handrail 57.9
Banister 57.9
Clothing 57.3
Apparel 57.3

Clarifai
created on 2023-10-25

portrait 99.4
woman 99.2
art 98.8
people 98.4
girl 97.9
one 96.9
sepia 96.7
adult 96.6
nude 96.5
vintage 96.1
retro 94.1
model 92.6
wear 90.6
sculpture 90
gold 89.3
fashion 88.2
old 87.8
child 87.6
man 86.9
sexy 86.9

Imagga
created on 2022-01-09

sexy 39.4
cover girl 35.5
model 34.2
fashion 32.4
body 29.6
portrait 29.1
lady 28.4
adult 27.2
attractive 26.6
people 26.2
hair 26.2
pretty 25.9
face 24.9
person 24.4
skin 22.9
style 22.3
black 21.7
posing 21.3
sensuality 20
blond 19.4
naked 19.3
sensual 18.2
human 18
erotic 17.5
dark 16.7
cute 16.5
sexual 16.4
elegant 16.3
passion 15.1
women 15
lips 14.8
make 14.5
desire 14.4
elegance 14.3
art 14.3
man 13.8
pose 13.6
one 13.4
child 13.2
love 12.6
glamor 12.5
couple 12.2
studio 12.2
male 11.7
nude 11.6
lifestyle 11.6
brunette 11.3
dress 10.9
vintage 10.8
hand 10.6
interior 10.6
world 10.6
old 10.5
looking 10.4
room 10.3
figure 10.3
expression 10.2
sculpture 9.9
romantic 9.8
sex 9.7
vogue 9.7
underwear 9.7
seductive 9.6
eyes 9.5
makeup 9.5
lingerie 9.1
smasher 9.1
gorgeous 9.1
religion 9
lovely 8.9
look 8.8
sitting 8.6
youth 8.5
two 8.5
stylish 8.1
bare 7.8
luxury 7.7
hairstyle 7.6
furniture 7.4
retro 7.4
romance 7.1
doll 7.1
clothing 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

human face 98.7
person 96.5
text 95.8
girl 91.8
portrait 85.9
woman 64.8

Color Analysis

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 100%
Calm 51.4%
Sad 45.6%
Angry 1%
Disgusted 0.5%
Fear 0.5%
Happy 0.4%
Confused 0.3%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 95.8%
Person 91.3%

Categories

Imagga

paintings art 55.5%
pets animals 38.4%
food drinks 4.6%

Captions