Human Generated Data

Title

Schematic Figure

Date

c. 2500 BCE

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Mr. and Mrs. Norbert Schimmel, 1965.519

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Figurine 93.6
Vegetable 90.2
Gourd 90.2
Produce 90.2
Food 90.2
Plant 90.2
Snow 74.1
Outdoors 74.1
Nature 74.1
Winter 74.1
Snowman 74.1
Pottery 72.4
Porcelain 59
Art 59
Bronze 56.9
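
The Amazon tags above are the kind of label/confidence output that Amazon Rekognition's DetectLabels operation returns; the record does not state the exact request used, so the following is only a minimal sketch, assuming configured AWS credentials and a placeholder local image file named figure.jpg.

import boto3

# Sketch: request image labels from Amazon Rekognition.
# "figure.jpg" is a placeholder for a photo of the object.
rekognition = boto3.client("rekognition")

with open("figure.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

for label in response["Labels"]:
    # Rekognition reports confidence on a 0-100 scale, as in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")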

Clarifai
created on 2018-03-16

no person 99.9
one 98
food 95.9
grow 95.6
still life 94.6
two 91.6
simplicity 89.8
ingredients 89.4
people 89.2
pottery 87.4
cooking 86.2
container 85.7
cutout 85
fungus 82.9
wood 82
group 81.8
tableware 79.8
handmade 79.4
nutrition 77.9
art 76.7
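
Clarifai concept scores like these can be obtained from Clarifai's v2 predict REST endpoint; the model ID, API key, and image URL in the sketch below are placeholders and assumptions, not details taken from this record.

import requests

# Sketch: Clarifai v2 predict call against a general image-recognition model.
# The model ID, API key, and image URL are placeholders.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed ID of the general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/figure.jpg"}}}]}
headers = {"Authorization": f"Key {API_KEY}"}

response = requests.post(url, json=payload, headers=headers).json()

for concept in response["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values between 0 and 1; scale to match the 0-100 list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")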

Imagga
created on 2018-03-16

vase 100
jar 80.4
vessel 67.5
container 63.2
china 33.5
ceramic ware 31.3
porcelain 30.6
utensil 21.5
pottery 15.1
bottle 14.1
food 13.7
earthenware 13.3
brown 13.2
glass 13.2
decoration 12.3
object 11.7
ceramic 11.6
pitcher 11.6
healthy 11.3
soap dispenser 10.4
health 9.7
diet 9.7
craft 9.5
dispenser 9.3
jug 9.3
art 9.3
fresh 9.1
ingredient 8.8
culture 8.5
traditional 8.3
milk 8.2
clay 7.8
nobody 7.8
flower 7.7
old 7.7
stone 7.6
fruit 7.5
economy 7.4
close 7.4
care 7.4
currency 7.2
sweet 7.1
animal 7.1
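
Tag/confidence pairs of this form come from Imagga's /v2/tags endpoint; the API key, secret, and image URL in the sketch below are placeholders, and the exact parameters used for this record are not given.

import requests

# Sketch: Imagga tagging endpoint with HTTP basic auth.
# Key, secret, and image URL are placeholders.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

result = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/figure.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
).json()

for entry in result["result"]["tags"]:
    # Imagga confidences are already on a 0-100 scale.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")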

Google
created on 2018-03-16

sculpture 84.4
artifact 75.5
figurine 60.6
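
The Google labels correspond to Cloud Vision label detection; a minimal sketch with the google-cloud-vision client library follows, assuming application credentials are configured and figure.jpg is a placeholder image file.

from google.cloud import vision

# Sketch: label detection with the Cloud Vision API client.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "figure.jpg" is a placeholder.
client = vision.ImageAnnotatorClient()

with open("figure.jpg", "rb") as f:
    image = vision.Image(content=f.read())

for label in client.label_detection(image=image).label_annotations:
    # Scores are 0-1 in the API; scale to match the 0-100 list above.
    print(f"{label.description} {label.score * 100:.1f}")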

Microsoft
created on 2018-03-16

ceramic ware 50.9
jar 25.6
stoneware 15.2
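
The Microsoft tags, and the captions listed further below, match what the Azure Computer Vision analyze endpoint returns when asked for Tags and Description; the endpoint URL, subscription key, and API version in this sketch are placeholders and assumptions.

import requests

# Sketch: Azure Computer Vision "analyze" REST call for tags and captions.
# Endpoint, subscription key, API version, and image URL are placeholders.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

analysis = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/figure.jpg"},
).json()

for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")

for caption in analysis["description"]["captions"]:
    # Ranked captions of the kind shown in the Captions section below.
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")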

Feature analysis

Amazon

Snowman 74.1%

Captions

Microsoft

a vase sitting on a table 16.1%
a white vase on a table 16%
a white vase 15.9%

Text analysis

Amazon

/065510
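
A detected string such as "/065510" is the sort of output Amazon Rekognition's DetectText (OCR) operation returns for writing or markings visible in the image; the sketch below reuses the same placeholder assumptions as the Rekognition label sketch above.

import boto3

# Sketch: OCR-style text detection with Amazon Rekognition.
# Assumes configured AWS credentials; "figure.jpg" is a placeholder.
rekognition = boto3.client("rekognition")

with open("figure.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")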