Human Generated Data

Title

Untitled (girl leaning on table with basket)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1236

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97.8
Human 97.8
Clothing 94.9
Apparel 94.9
Art 79.1
Female 59.7
Pottery 57.5
Painting 56.7
Dress 55.7

Clarifai
created on 2023-10-27

portrait 99.7
people 99.6
one 99.3
art 99.2
girl 98.8
dress 98.6
wear 98.5
retro 97.8
child 96.4
woman 95.8
adult 94.6
model 93.2
vintage 92.1
lid 91.8
princess 90.2
fashion 89.7
doll 85.7
magic 85.6
room 84.8
nostalgia 84.1

Imagga
created on 2022-01-23

fashion 44.5
sexy 42.6
dress 42.5
model 35.8
pretty 32.9
portrait 32.4
attractive 32.2
adult 31.8
costume 29.2
skirt 28.8
person 28.1
clothing 27.9
hair 27.8
people 26.2
brunette 24.4
studio 24.3
domestic 23.9
doll 23.9
lady 23.6
black 22.7
posing 22.2
face 21.3
outfit 21.3
elegance 21
style 20.8
cute 20.1
sensual 20
sensuality 20
make 19.1
plaything 18.4
happy 18.2
hairstyle 17.2
smile 16.4
garment 15.7
gorgeous 15.4
stylish 14.5
women 14.2
youth 13.6
pose 13.6
fashionable 13.3
elegant 12.9
makeup 12.8
blond 12.7
glamor 12.5
clothes 12.2
expression 12
smiling 11.6
feminine 11.2
eyes 11.2
lips 11.1
happiness 11
lovely 10.7
cheerful 10.6
modern 10.5
look 10.5
wearing 10.5
dance 10.5
dancer 9.9
hand 9.9
body 9.6
child 9.2
fun 9
vogue 8.7
seductive 8.6
party 8.6
figure 8.4
pink 8.4
long 8.3
human 8.3
one 8.2
looking 8
standing 7.8
covering 7.7
bride 7.7
skin 7.6
dark 7.5
joy 7.5
vintage 7.4
brown 7.4
performer 7.4
lifestyle 7.2
interior 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

wall 99.4
text 99.1
indoor 96
dress 95.4
clothing 94.5
person 93.2
human face 91.8
black 73.1
retro style 72.8
portrait 70.7
sketch 68.3
vintage clothing 67.8
girl 66.3
skirt 58.8
fashion 58.2
woman 50.2
posing 37.5

Color Analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 82.7%
Calm 99.9%
Sad 0%
Surprised 0%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 97.8%

Categories

Imagga

paintings art 73.7%
people portraits 24.3%