Human Generated Data

Title

Jar

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.145

Machine Generated Data

Tags

Imagga
created on 2018-12-21

pot 100
vessel 89.6
cooking utensil 75.6
container 72.9
utensil 55.7
kitchen utensil 50.3
earthenware 48.4
traditional 34.1
ceramic ware 32.7
drink 30.1
ceramic 28.1
cup 27.4
pottery 26.7
vase 24
tea 23.5
china 22.3
culture 21.4
old 20.9
brown 20.6
clay 18.5
object 18.3
ceramics 17.6
decoration 17.4
art 16.9
teapot 16.9
breakfast 16.8
food 16.3
handle 16.2
kitchen 16.1
pitcher 15.2
oriental 15.1
close 14.8
coffee 14.8
tradition 14.8
ancient 14.7
ceremony 14.6
beverage 14.5
single 14
bowl 13.9
jug 13.4
hot 13.4
jar 12.7
black 12.6
yellow 12.6
decorative 12.5
craft 12.4
antique 12.1
healthy 12
nobody 11.7
herbal 11.5
empty 11.2
retro 10.7
porcelain 10.6
heat 10.2
color 10
refreshment 10
morning 9.9
vintage 9.9
saucer 9.7
caffeine 9.7
metal 9.7
classical 9.6
water 9.3
east 9.3
kitchenware 8.8
closeup 8.8
flower 8.5
domestic 8.1
natural 8
lid 7.9
fresh 7.8
liquid 7.8
table 7.8
glass 7.8
serving 7.7
mug 7.7
spoon 7.6
relaxation 7.5
aroma 7.5
style 7.4
ornate 7.3
diet 7.3

Google
created on 2018-12-21

Microsoft
created on 2018-12-21

black 82.6
white 75.1
plant 71.9
ceramic ware 59.1
stoneware 39.8
pottery 5.8
porcelain 5.8
ceramic 2.2
vase 0.3

Captions

Microsoft

a black and white photo of a vase 62.4%
a white vase on a table 60.2%
black and white photo of a vase 53.5%