Human Generated Data

Title: Book
Date: -
People: -
Classification: Manuscripts
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Museum Collection, 1978.477.80

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Person 98.7%
Human 98.7%
Person 97.8%
Person 92.7%
Art 88.6%
Person 81.8%
Drawing 77.5%
Modern Art 72.7%
People 69.4%
Text 66.9%
Silhouette 65.6%
Duel 59.3%
Doodle 56%

Clarifai
created on 2019-07-07

people 99.4%
illustration 99.3%
group 99.3%
man 98.7%
adult 95.4%
silhouette 93.8%
wear 93.7%
woman 92.8%
painting 92.7%
art 91.4%
outfit 90.5%
two 89.6%
music 89.4%
cooperation 89%
print 88.9%
dancing 88.9%
retro 88.7%
action 87.9%
weapon 87.4%
competition 84.7%

Imagga
created on 2019-07-07

silhouette 48%
sketch 42%
drawing 39.2%
dance 37.1%
art 28.4%
weevil 27.6%
black 27.1%
representation 25.1%
beetle 22.9%
people 20.1%
insect 19.5%
man 19.5%
creation 17.4%
sport 16.5%
male 16.3%
animal 16.2%
arthropod 15.3%
horse 15.2%
grunge 12.8%
sunset 10.8%
design 10.7%
silhouettes 10.7%
men 10.3%
person 10.1%
competition 9.1%
active 9%
fun 9%
team 9%
group 8.9%
graphic 8.7%
ride 8.7%
dancing 8.7%
race 8.6%
outline 8.5%
old 8.4%
action 8.3%
sports 8.3%
paint 8.1%
dirty 8.1%
sun 8%
tripod 8%
boy 7.9%
couple 7.8%
party 7.7%
ink 7.7%
sky 7.6%
decoration 7.6%
friends 7.5%
shape 7.5%
element 7.4%
mammal 7.3%
exercise 7.3%
recreation 7.2%
activity 7.2%

Google
created on 2019-07-07

Art 76.7%
Illustration 75.9%
Drawing 69.1%
Visual arts 59.3%

Microsoft
created on 2019-07-07

text 99.5%
drawing 98.8%
sketch 97.5%
book 94.9%
cartoon 94.1%
art 74.7%
child art 72.5%
illustration 71.3%

Feature analysis

Amazon

Person 98.7%

Categories

Imagga

paintings art 98.5%
pets animals 1.2%

Captions

Microsoft
created on 2019-07-07

a group of people looking at a book 28.1%