Human Generated Data

Title

Album of Chinese Export Paintings: Women's Occupations

Date

19th century

People

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Fogg Museum Library, 1965.39.1

Machine Generated Data

Tags

Amazon
created on 2019-05-31

Art 96.4%
Painting 92.3%
Human 88.4%
Person 88.4%
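Label lists like the Amazon block above follow the shape of the AWS Rekognition DetectLabels response (`Labels: [{"Name": ..., "Confidence": ...}]`). A minimal sketch of turning such a response into the tag lines shown here; the sample response dict is hypothetical (a real one would come from boto3's `detect_labels` call), as is the `min_confidence` threshold:

```python
# Hypothetical DetectLabels-style response mirroring the values above;
# real data would come from boto3's rekognition detect_labels().
sample_response = {
    "Labels": [
        {"Name": "Art", "Confidence": 96.4},
        {"Name": "Painting", "Confidence": 92.3},
        {"Name": "Human", "Confidence": 88.4},
        {"Name": "Person", "Confidence": 88.4},
    ]
}

def format_labels(response, min_confidence=55.0):
    """Render labels as 'Name confidence%' lines, highest confidence first."""
    labels = [l for l in response["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:.1f}%" for l in labels]

for line in format_labels(sample_response):
    print(line)
```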

Clarifai
created on 2019-05-31

painting 97.6%
wear 96.0%
retro 94.8%
art 93.6%
people 93.1%
woman 91.6%
one 91.1%
illustration 89.6%
adult 89.3%
vintage 87.9%
style 85.3%
antique 85.0%
child 82.2%
old 81.6%
artistic 80.3%
wall 78.6%
traditional 76.6%
indoors 76.2%
print 76.1%
design 74.1%

Imagga
created on 2019-05-31

wall 24.5%
architecture 22.8%
old 21.6%
building 20.2%
art 18.4%
culture 16.2%
door 14.3%
history 14.3%
vintage 14.0%
decoration 13.2%
entrance 12.5%
face 11.4%
travel 11.3%
ancient 11.2%
lady 10.5%
person 10.2%
traditional 10.0%
city 10.0%
sculpture 9.9%
bookmark 9.8%
graffito 9.8%
post 9.5%
holiday 9.3%
device 9.3%
religion 9.0%
people 8.9%
brown 8.8%
temple 8.7%
antique 8.6%
grunge 8.5%
historical 8.5%
texture 8.3%
church 8.3%
fashion 8.3%
letter 8.2%
historic 8.2%
man 8.1%
detail 8.0%
sexy 8.0%
carving 8.0%
doorway 7.9%
arch 7.7%
heritage 7.7%
mail 7.6%
details 7.5%
wood 7.5%
style 7.4%
retro 7.4%
street 7.4%
aged 7.2%
portrait 7.1%
statue 7.0%

Google
created on 2019-05-31

Illustration 79.3%
Art 77.2%
Painting 66.1%
Laundry 55.4%
Visual arts 55.0%
Magenta 53.3%

Microsoft
created on 2019-05-31

cartoon 98.5%
drawing 98.3%
painting 94.9%
clothing 93.8%
sketch 92.7%
child art 90.8%
person 90.0%
human face 78.9%
illustration 69.9%
woman 58.7%

Face analysis

AWS Rekognition

Age 26-44
Gender Female, 72.6%
Sad 35.3%
Calm 25.8%
Surprised 11.4%
Angry 11.1%
Confused 9.2%
Happy 5.0%
Disgusted 2.3%
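The age, gender, and emotion values above follow the shape of a Rekognition DetectFaces `FaceDetails` entry. A small sketch that summarizes such an entry; the `face` dict below is a hypothetical sample mirroring this record's values (real data would come from boto3's `detect_faces` with `Attributes=["ALL"]`):

```python
# Hypothetical DetectFaces-style FaceDetails entry mirroring this record;
# real data would come from boto3's rekognition detect_faces().
face = {
    "AgeRange": {"Low": 26, "High": 44},
    "Gender": {"Value": "Female", "Confidence": 72.6},
    "Emotions": [
        {"Type": "SAD", "Confidence": 35.3},
        {"Type": "CALM", "Confidence": 25.8},
        {"Type": "SURPRISED", "Confidence": 11.4},
        {"Type": "ANGRY", "Confidence": 11.1},
        {"Type": "CONFUSED", "Confidence": 9.2},
        {"Type": "HAPPY", "Confidence": 5.0},
        {"Type": "DISGUSTED", "Confidence": 2.3},
    ],
}

def summarize_face(detail):
    """Return (age range, gender, dominant emotion) for one face detail."""
    age = f"{detail['AgeRange']['Low']}-{detail['AgeRange']['High']}"
    gender = detail["Gender"]["Value"]
    top = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return age, gender, top["Type"]

print(summarize_face(face))  # ('26-44', 'Female', 'SAD')
```

Note that the emotion confidences are a distribution over candidate emotions, so the dominant type (here `SAD` at 35.3%) is still a low-confidence call.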

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
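Unlike the other services, Google Vision reports face attributes as Likelihood enum values (`VERY_UNLIKELY` through `VERY_LIKELY`) rather than percentages. A sketch of rendering a face-annotation dict into the lines above; the annotation sample is hypothetical, though the field and enum names match the Vision API's `FaceAnnotation` JSON shape:

```python
# Map Vision API Likelihood enum names to display strings.
LIKELIHOOD_NAMES = {
    "UNKNOWN": "Unknown",
    "VERY_UNLIKELY": "Very unlikely",
    "UNLIKELY": "Unlikely",
    "POSSIBLE": "Possible",
    "LIKELY": "Likely",
    "VERY_LIKELY": "Very likely",
}

# Hypothetical FaceAnnotation-style dict mirroring this record; real data
# would come from the Vision API's face_detection response.
annotation = {
    "surpriseLikelihood": "VERY_UNLIKELY",
    "angerLikelihood": "VERY_UNLIKELY",
    "sorrowLikelihood": "VERY_UNLIKELY",
    "joyLikelihood": "UNLIKELY",
    "headwearLikelihood": "VERY_UNLIKELY",
    "blurredLikelihood": "VERY_UNLIKELY",
}

def render(annotation):
    """Render '<Field> <Likelihood>' lines from a face annotation dict."""
    rows = []
    for key, value in annotation.items():
        field = key.removesuffix("Likelihood").capitalize()
        rows.append(f"{field} {LIKELIHOOD_NAMES[value]}")
    return rows

for row in render(annotation):
    print(row)
```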

Feature analysis

Amazon

Person 88.4%