Human Generated Data

Title

Untitled (studio portrait of seated woman with hat and coat holding on to infant in wheeled carriage)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3713

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 96.4
Human 96.4
Person 93.6
Apparel 93.6
Clothing 93.6
Furniture 87.3
Machine 86.7
Wheel 86.7
Face 77.4
Photo 70
Portrait 70
Photography 70
Transportation 67.9
Bike 67.9
Bicycle 67.9
Vehicle 67.9
People 66.4
Hat 59.3
Sleeve 57.3

Clarifai
created on 2019-06-01

people 99.9
adult 98.8
man 97.9
two 97.7
wear 96.8
one 96.2
woman 94.3
vehicle 93.5
transportation system 88.4
child 87.5
sitting 87.1
sit 86.3
veil 85.7
group 83.9
group together 83.3
three 82.1
leader 81.3
outfit 81.2
military 78.7
actor 76.7

Imagga
created on 2019-06-01

person 25.5
sketch 22
people 21.2
man 18.8
adult 16.8
drawing 16.4
bride 15.4
happiness 14.9
negative 14.4
portrait 14.2
love 14.2
dress 13.5
male 13.5
mother 13.2
human 12.7
patient 12.4
representation 12.4
wedding 11.9
old 11.8
men 11.2
cheerful 10.6
groom 10.4
film 10
family 9.8
couple 9.6
women 9.5
work 9.4
religion 9
gown 8.8
clothing 8.8
bouquet 8.7
case 8.6
face 8.5
care 8.2
statue 8.1
history 8
child 7.9
medical 7.9
grandfather 7.8
sick person 7.8
married 7.7
winter 7.7
health 7.6
historical 7.5
happy 7.5
vintage 7.4
decoration 7.4
historic 7.3
art 7.3
sexy 7.2
lifestyle 7.2
black 7.2
celebration 7.2
marble 7.2
smile 7.1
working 7.1
medicine 7
nurse 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

outdoor 95
clothing 89.4
old 86.5
black and white 85.8
person 82.2
human face 64.6
white 63.8
hat 52.3
posing 43.4

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 94.2%
Happy 7.2%
Sad 35.4%
Confused 9.9%
Angry 5.2%
Disgusted 16%
Surprised 6.1%
Calm 20.2%

Feature analysis

Amazon

Person 96.4%
Wheel 86.7%
Bicycle 67.9%

Captions

Microsoft

a vintage photo of a person 92.4%
an old black and white photo of a person 89.5%
an old photo of a person 89.4%