Human Generated Data

Title

Untitled (studio portrait of costumed woman reclining in chair holding cigarette in holder)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3760

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.2
Person 99.2
Clothing 98.6
Apparel 98.6
Furniture 91.8
Chair 88.6
Shorts 83.5
Footwear 71.2
Shoe 71.2
Portrait 62.2
Photography 62.2
Face 62.2
Photo 62.2
Female 61
Flooring 58.8
Text 57
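The Amazon tags above follow the shape of AWS Rekognition's detect_labels response (a list of label names with percent confidences). A minimal sketch of how such a listing could be rendered from a response of that shape — the sample_response dict and the 50% threshold are assumptions, not data from this record:

```python
# Minimal sketch: formatting an AWS Rekognition detect_labels-style
# response into the "Label Confidence" listing shown above.
# sample_response is an assumption shaped like the real API's output;
# a live call would instead use
#   boto3.client("rekognition").detect_labels(Image={"Bytes": ...})

sample_response = {
    "Labels": [
        {"Name": "Human", "Confidence": 99.2},
        {"Name": "Chair", "Confidence": 88.6},
        {"Name": "Portrait", "Confidence": 62.2},
    ]
}

def format_labels(response, min_confidence=50.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {conf:.1f}" for name, conf in labels]

for line in format_labels(sample_response):
    print(line)  # e.g. "Human 99.2"
```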

Clarifai
created on 2019-06-01

people 99.8
one 99.4
adult 99.2
man 96.9
wear 96.6
two 94.7
woman 92.9
monochrome 89.7
furniture 88.6
sit 87.8
portrait 86.7
indoors 86
outfit 84.1
leader 83.9
position 79.8
group 79.4
actor 79
group together 78.4
sports equipment 77.6
administration 76.3

Imagga
created on 2019-06-01

negative 34.7
film 28.7
photographic paper 21.4
portrait 18.1
person 15.4
art 15.2
adult 14.6
black 14.4
photographic equipment 14.3
people 13.9
dress 13.5
face 12.8
sculpture 12.7
statue 12.5
body 12
model 11.7
lady 11.4
fashion 11.3
attractive 11.2
lifestyle 10.8
crutch 10.6
old 10.4
sexy 10.4
city 10
one 9.7
standing 9.6
wall 9.4
grunge 9.4
light 9.4
world 9.2
pretty 9.1
building 8.8
man 8.7
naked 8.7
windowsill 8.6
culture 8.5
head 8.4
traditional 8.3
human 8.2
alone 8.2
staff 8.2
pose 8.1
history 8
stick 8
hair 7.9
women 7.9
urban 7.9
travel 7.7
luxury 7.7
bride 7.7
skin 7.6
style 7.4
detail 7.2
dirty 7.2
sill 7.1
posing 7.1
male 7.1
look 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

sketch 98.5
drawing 98.2
painting 79.5
clothing 77.1
black and white 76.7
person 74.1

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 54.4%
Angry 45.2%
Happy 45.2%
Confused 45.1%
Sad 54.1%
Calm 45.2%
Surprised 45.1%
Disgusted 45.1%
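The age range, gender, and emotion scores above correspond to the FaceDetails structure returned by AWS Rekognition's detect_faces. A minimal sketch of how that structure could be flattened into the listing shown — sample_response is an assumed response fragment, not the record's raw data:

```python
# Minimal sketch: flattening an AWS Rekognition detect_faces-style
# response into the age / gender / emotion listing shown above.
# sample_response is an assumption shaped like the real API's output;
# a live call would instead use
#   boto3.client("rekognition").detect_faces(Image=..., Attributes=["ALL"])

sample_response = {
    "FaceDetails": [{
        "AgeRange": {"Low": 26, "High": 43},
        "Gender": {"Value": "Female", "Confidence": 54.4},
        "Emotions": [
            {"Type": "SAD", "Confidence": 54.1},
            {"Type": "ANGRY", "Confidence": 45.2},
        ],
    }]
}

def summarize_face(face):
    """Return lines like 'Age 26-43', 'Gender Female, 54.4%', 'Sad 54.1%'."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    emotions = sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True)
    for emotion in emotions:
        lines.append(f"{emotion['Type'].capitalize()} "
                     f"{emotion['Confidence']:.1f}%")
    return lines

for line in summarize_face(sample_response["FaceDetails"][0]):
    print(line)  # e.g. "Age 26-43"
```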

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a black and white photo of a person 76%
an old photo of a person 75.9%
old photo of a person 75.8%

Text analysis

Amazon

WXXX
NN