Human Generated Data

Title

Untitled (young woman seated sideways in chair reading card)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3735

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.9
Human 98.9
Reading 97.8
Female 67.9
Girl 65
Apparel 62.6
Clothing 62.6
Photography 61.1
Photo 61.1
Face 61.1
Portrait 61.1
Text 58

Clarifai
created on 2019-06-01

people 99.4
one 97.8
portrait 95.3
adult 95.2
child 93.4
monochrome 90.6
man 90.4
wear 87.5
retro 87.5
woman 86.3
art 84.4
indoors 79
sculpture 78.6
veil 78.4
paper 77.1
girl 75.3
music 74.1
vertical 73.2
statue 72.2
religion 70.5

Imagga
created on 2019-06-01

shower cap 100
cap 100
headdress 100
clothing 66.6
consumer goods 33.9
covering 33.5
person 31.8
portrait 28.5
adult 28.5
people 26.2
face 24.2
smile 22.1
man 21.5
male 21.3
happy 20.1
hair 18.2
smiling 17.4
human 17.3
love 16.6
looking 16
home 16
one 15.7
happiness 15.7
health 15.3
care 14.8
lifestyle 14.5
casual 13.6
women 13.4
medical 13.2
lady 13
men 12.9
bride 12.5
doctor 12.2
cheerful 12.2
professional 11.8
worker 11.6
indoors 11.4
pretty 11.2
model 10.9
medicine 10.6
attractive 10.5
couple 10.5
sitting 10.3
work 10.2
wedding 10.1
cute 10
veil 9.8
eyes 9.5
nurse 9.4
mature 9.3
alone 9.1
hand 9.1
fashion 9.1
body 8.8
mask 8.6
expression 8.5
relaxation 8.4
house 8.4
room 8.2
girls 8.2
dress 8.1
child 8.1
life 8.1
family 8
mid adult 7.7
married 7.7
elderly 7.7
studio 7.6
joy 7.5
emotion 7.4
occupation 7.3
window 7.3
childhood 7.2
interior 7.1
working 7.1
modern 7
look 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 75.7%
Disgusted 2.9%
Happy 2.2%
Surprised 2.9%
Sad 9.7%
Angry 3.9%
Confused 3.5%
Calm 74.9%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a person sitting on a bed 49.3%
a person sitting on a bed 49.2%
a man and a woman sitting on a bed 27.2%