Human Generated Data

Title: JUG
Date:
People:
Classification: Vessels
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Misses Norton, 1920.44.173

Machine Generated Data

Tags (each entry is a label followed by its confidence score out of 100)

Amazon
created on 2022-01-29

Jug 99.7
Water Jug 66.1
Pear 56.7
Food 56.7
Fruit 56.7
Plant 56.7
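
The Amazon labels above come from an automated image-tagging service. Below is a minimal sketch, not the museum's actual pipeline, of how label/confidence pairs in this form can be produced with Amazon Rekognition's DetectLabels call via boto3; the image file name is a placeholder:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("jug.jpg", "rb") as f:   # hypothetical local photo of the object
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=50,              # keep only labels at or above 50% confidence
    )

    for label in response["Labels"]:
        # Prints e.g. "Jug 99.7", matching the label/score format above
        print(f'{label["Name"]} {label["Confidence"]:.1f}')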

Imagga
created on 2022-01-29

vessel 100
pitcher 92
container 83.7
jug 58.6
vase 46.9
water jug 33.2
bottle 32.9
jar 29.2
brown 25.8
traditional 24.1
pot 22.3
antique 21.7
object 21.2
old 20.9
food 19.9
ancient 19.9
culture 19.7
pottery 17.7
decoration 17.4
handle 16.2
whiskey jug 15.7
history 15.2
craft 14.3
art 13.7
utensil 13.6
bread 12.9
ceramics 12.7
earthenware 12.7
clay 12.7
loaf 12.6
close 12.6
classical 12.4
china 12.4
porcelain 12.3
healthy 12
decorative 11.7
vintage 11.6
breakfast 11.5
retro 11.5
ornament 11.2
ornate 11
drink 10.9
ceramic ware 10.7
baked 10.3
nobody 10.1
teapot 10
terracotta 9.9
meal 9.7
metal 9.7
obsolete 9.6
closeup 9.4
delicious 9.1
cup 9
gold 9
kitchen 8.9
style 8.9
diet 8.9
cereal 8.7
glass 8.6
fresh 8.5
urn 7.9
life 7.8
ceramic 7.7
crust 7.7
snack 7.7
pastry 7.6
eating 7.6
tasty 7.5
natural 7.4
yellow 7.3
slice 7.3
sweet 7.1
textured 7
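
The Imagga tags above use the same label/confidence format. A minimal sketch of retrieving such tags from Imagga's /v2/tags REST endpoint is shown below; the API credentials and image URL are placeholders:

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"             # placeholder credentials
    API_SECRET = "YOUR_IMAGGA_API_SECRET"
    IMAGE_URL = "https://example.org/jug.jpg"   # hypothetical image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),             # HTTP basic auth with key/secret
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        # Prints e.g. "vessel 100.0", matching the tag/score list above
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')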

Google
created on 2022-01-29

Vase 91.9
Serveware 84.5
Creative arts 83.6
Artifact 81.7
Tableware 79.9
Pottery 79
Art 77.2
earthenware 75.2
Clay 70.4
Ceramic 65.8
Drinkware 65.3
Pitcher 64.7
History 58.5
Rock 57.5
Still life photography 57.2
Ancient history 56.8
Archaeology 53.8
Still life 52.3
Wood 50.9
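
The Google labels above follow the same pattern. Below is a minimal sketch, assuming application-default credentials and the google-cloud-vision client library, of producing label scores with the Vision API's label detection; the image path is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("jug.jpg", "rb") as f:      # hypothetical local photo of the object
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    for annotation in response.label_annotations:
        # Vision returns scores in [0, 1]; scale by 100 to match the list above
        print(f"{annotation.description} {annotation.score * 100:.1f}")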

Microsoft
created on 2022-01-29

stone 73.8
vessel 60.4
artifact 57.6
jar 47.6
ceramic ware 46.4

Feature analysis

Amazon

Pear 56.7%

Captions

Microsoft

a close up of a doughnut 43%
a close up of a donut 37.5%
a large doughnut 29%
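
The Microsoft tags and caption candidates above are the kind of output returned by Azure's Computer Vision service. A minimal sketch, assuming an Azure Computer Vision resource and the azure-cognitiveservices-vision-computervision SDK, is shown below; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),           # placeholder key
    )

    IMAGE_URL = "https://example.org/jug.jpg"   # hypothetical image URL

    # Tags with confidence scores, e.g. "stone 73.8"
    for tag in client.tag_image(IMAGE_URL).tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")

    # Caption candidates with confidence scores, e.g. "a close up of a doughnut 43%"
    for caption in client.describe_image(IMAGE_URL, max_candidates=3).captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")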