Human Generated Data

Title

Jar

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Edward P. Bliss, 1916.287

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Plant 89.6
Pottery 87.2
Jar 78.1
Vase 72.6
Jug 70.3
Pear 69.4
Food 69.4
Fruit 69.4
Produce 60.9

Imagga
created on 2022-01-29

whiskey jug 100
jug 100
vessel 93.1
vase 85.5
bottle 81.2
container 65.7
jar 61
food 22.9
brown 19.9
object 17.6
traditional 15.8
fruit 15.5
healthy 14.5
decoration 14.5
old 13.2
glass 13.2
organic 12.6
pot 12.5
pottery 12.3
close 12
yellow 11.9
fresh 11.8
nobody 11.7
antique 11.2
ancient 11.2
nutrition 10.9
decorative 10.9
delicious 10.7
loaf 10.7
craft 10.5
culture 10.3
bread 10.2
natural 10
freshness 10
vintage 9.9
history 9.8
art 9.8
meal 9.7
pear 9.7
diet 9.7
sweet 9.5
vegetable 9.4
orange 9.2
tasty 9.2
drink 9.2
breakfast 8.8
clay 8.8
closeup 8.8
pitcher 8.7
ornament 8.6
single 8.2
ripe 8.2
raw 8
terracotta 7.9
agriculture 7.9
ceramics 7.8
color 7.8
snack 7.7
gourmet 7.6
eating 7.6
baked 7.5
still 7.4
earthenware 7.3
juicy 7.3
vegetarian 7.1
dessert 7.1
ingredient 7
life 7

Google
created on 2022-01-29

Artifact 84.1
Vase 79.9
Bottle 76.7
Natural material 71.2
Art 70.5
Pottery 62.8
earthenware 57.3

Microsoft
created on 2022-01-29

jar 94.2
vessel 90.8
ceramic ware 72.1
porcelain 5.7

Feature analysis

Amazon

Pear 69.4%

Captions

Microsoft

a close up of a vase 42.3%
a tall vase 38.4%
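
Each machine-generated tag above is paired with a confidence score, and the "Feature analysis" section keeps only a high-confidence subset. A minimal Python sketch of that kind of thresholding, using the Amazon scores listed above (the `filter_tags` helper and the hard-coded `amazon_tags` dictionary are illustrative assumptions, not part of any vendor API):

```python
# Tag scores copied from the Amazon block above.
amazon_tags = {
    "Plant": 89.6, "Pottery": 87.2, "Jar": 78.1, "Vase": 72.6,
    "Jug": 70.3, "Pear": 69.4, "Food": 69.4, "Fruit": 69.4,
    "Produce": 60.9,
}

def filter_tags(tags, min_confidence):
    """Return (label, score) pairs at or above min_confidence, highest first."""
    return sorted(
        ((label, score) for label, score in tags.items()
         if score >= min_confidence),
        key=lambda pair: -pair[1],
    )

print(filter_tags(amazon_tags, 70.0))
# → [('Plant', 89.6), ('Pottery', 87.2), ('Jar', 78.1), ('Vase', 72.6), ('Jug', 70.3)]
```

Lowering the threshold to 69.0 would also admit the "Pear", "Food", and "Fruit" tags that the Feature analysis above surfaces at 69.4%.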