Human Generated Data

Title

Untitled (studio portrait of child wearing white dress standing by chair)

Date

c. 1905-1915, printed c. 1970

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5989

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Apparel 100
Clothing 100
Hat 99.9
Bonnet 99.9
Person 99
Human 99
Person 95.6
Painting 65.2
Art 65.2
Toy 61

Clarifai
created on 2019-11-16

people 99.9
actress 98.9
theater 98.5
sit 98.5
woman 98.4
wear 97.8
adult 97.4
indoors 97
child 96.5
group 95.7
furniture 95.7
actor 95.5
room 95.2
music 94.5
sitting 93.8
dancer 93.3
opera 93
man 92.2
family 90.8
movie 90.7

Imagga
created on 2019-11-16

kin 35.6
groom 23.4
people 19
world 17.7
portrait 16.8
bride 16.7
sculpture 15.2
dress 14.5
religion 14.3
man 14.2
person 14
statue 14
face 13.5
male 13.5
couple 13.1
art 13
romantic 12.5
love 11.8
happiness 11.8
fashion 11.3
happy 11.3
sexy 11.2
old 11.2
stone 11.1
black 10.9
adult 10.4
historical 10.4
monument 10.3
architecture 10.2
wedding 10.1
one 9.7
body 9.6
ancient 9.5
dark 9.2
history 9
bridal 8.8
clothing 8.6
culture 8.6
marriage 8.5
bouquet 8.5
religious 8.4
head 8.4
attractive 8.4
famous 8.4
traditional 8.3
city 8.3
vintage 8.3
light 8
family 8
celebration 8
hair 7.9
antique 7.8
god 7.7
studio 7.6
passion 7.5
fun 7.5
style 7.4
detail 7.2
smile 7.1
doll 7.1
posing 7.1
look 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.2
human face 97.6
clothing 97.5
person 95.9
baby 89.4
black and white 81.6
smile 78.2
toddler 76.7
woman 62.3
child 53.5
old 52.8
posing 36.5

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 71.8%
Disgusted 0.2%
Confused 0.5%
Angry 0.5%
Happy 0.5%
Sad 83.7%
Calm 9.9%
Surprised 0.2%
Fear 4.7%

AWS Rekognition

Age 0-3
Gender Female, 90.5%
Happy 0.1%
Confused 3.1%
Calm 56.2%
Fear 0.8%
Angry 21.6%
Surprised 2.2%
Disgusted 0.5%
Sad 15.5%

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Painting 65.2%

Captions

Microsoft

an old photo of a girl 65.1%
an old photo of a person 65%
old photo of a girl 60.3%