Human Generated Data

Title

Untitled (portrait of woman standing with elbow on cabinet, hands clasped)

Date

c. 1940, printed later

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13208

Machine Generated Data

Tags (values are confidence scores on a 0-100 scale)

Amazon
created on 2019-11-16

Clothing 98.4
Apparel 98.4
Person 97
Human 97
Robe 79.4
Fashion 79.4
Female 69.2
Coat 67.9
Overcoat 67.9
Evening Dress 63.7
Gown 63.7
Art 61.2
Photography 60.4
Photo 60.4
Face 60.4
Portrait 60.4
Door 58.8
Sleeve 58.5
Indoors 56.6
Floor 56.2
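
The label scores above are the type of output returned by Amazon Rekognition's label-detection API. As a minimal sketch of how such tags could be generated (assuming a local copy of the image, valid AWS credentials, and the boto3 library; the file name is hypothetical):

import boto3

# Label detection returns object/scene labels with confidence scores on a 0-100 scale.
client = boto3.client("rekognition")
with open("fogg_4.2002.13208.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Prints lines such as "Clothing 98.4" or "Person 97.0", matching the list above.
    print(label["Name"], round(label["Confidence"], 1))

MinConfidence trims low-scoring labels; the list above cuts off in the mid-50s, so a threshold of roughly that value is assumed here.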

Clarifai
created on 2019-11-16

people 99.9
one 99.2
adult 99
portrait 98.1
man 96.7
woman 95.5
furniture 93.4
wear 93.4
music 92.6
monochrome 91.1
indoors 89.8
room 89.1
gown (clothing) 88.5
musician 88.4
sit 88
seat 86.9
two 85
piano 84.2
chair 82.8
art 80.6

Imagga
created on 2019-11-16

person 26.5
people 22.9
portrait 21.3
man 20.8
fashion 20.3
adult 19.7
sexy 19.3
black 18.3
lady 17.8
attractive 17.5
male 17.5
refrigerator 17.2
model 15.5
white goods 15.1
hair 15.1
pretty 14
dress 13.6
dark 13.4
posing 13.3
happy 13.2
face 12.8
interior 12.4
home appliance 12.1
world 12
human 12
body 12
home 12
one 11.9
indoor 11.9
brunette 11.3
skin 11
sensual 10.9
sensuality 10.9
room 10.5
old 10.4
windowsill 10.4
eyes 10.3
elegance 10.1
make 10
smile 10
religion 9.9
lifestyle 9.4
expression 9.4
device 9.1
sill 8.9
clothing 8.8
looking 8.8
couple 8.7
standing 8.7
love 8.7
window 8.4
vintage 8.3
appliance 8.2
statue 8.1
upright 7.9
boy 7.9
indoors 7.9
cute 7.9
look 7.9
elegant 7.7
healthy 7.6
light 7.4
style 7.4
inside 7.4
structural member 7.3
makeup 7.3
art 7.2
night 7.1
child 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.4
indoor 94.9
human face 93.4
clothing 92.3
person 88.2
black and white 87.1
smile 79.4
standing 78.3
black 68.8

Color Analysis

Face analysis

AWS Rekognition

Age 28-44
Gender Female, 93.3%
Angry 1.6%
Happy 9.2%
Disgusted 1.3%
Confused 1.2%
Surprised 0.8%
Calm 84.3%
Fear 0.4%
Sad 1.3%
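
The age range, gender, and emotion percentages above correspond to Rekognition's face-attribute output. A minimal sketch, under the same assumptions as the label-detection example earlier:

import boto3

client = boto3.client("rekognition")
with open("fogg_4.2002.13208.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    print(face["AgeRange"])  # e.g. {"Low": 28, "High": 44}
    print(face["Gender"])    # e.g. {"Value": "Female", "Confidence": 93.3}
    for emotion in face["Emotions"]:
        # e.g. CALM 84.3, HAPPY 9.2, SAD 1.3
        print(emotion["Type"], round(emotion["Confidence"], 1))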

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
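
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch, assuming the google-cloud-vision Python client library and the same hypothetical image file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("fogg_4.2002.13208.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is an enum bucket, e.g. VERY_UNLIKELY, matching the values above.
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)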

Feature analysis

Amazon

Person 97%

Categories

Text analysis

Amazon

1-J011
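
The string above is the kind of result returned by Rekognition's text-detection call, which reads any text visible in the image. A minimal sketch under the same assumptions as the earlier Rekognition examples:

import boto3

client = boto3.client("rekognition")
with open("fogg_4.2002.13208.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries give complete strings such as "1-J011"; WORD entries break them apart.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))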