Human Generated Data

Title

Jug

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Richard Norton, 1909.39
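
The catalog fields above (title, date, people, classification, credit line) are the museum's human-generated record. As a point of reference, here is a minimal sketch of how such a record could be fetched from the Harvard Art Museums public API; the API key is a placeholder, and the query field name ("objectnumber") and response field names are assumptions to verify against the API documentation.

import requests

API_KEY = "YOUR_HARVARD_ART_MUSEUMS_KEY"  # placeholder; register for a real key

resp = requests.get(
    "https://api.harvardartmuseums.org/object",
    # Treating "objectnumber" as the accession-number field is an assumption.
    params={"apikey": API_KEY, "q": "objectnumber:1909.39", "size": 1},
    timeout=30,
)
resp.raise_for_status()
record = resp.json()["records"][0]

for field in ("title", "dated", "classification", "creditline"):
    print(field, ":", record.get(field))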

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-01-29

Jug 98
Bread 97.5
Food 97.5
Pottery 74.5
Bronze 57.2
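
The Amazon tags above are label/confidence pairs of the kind produced by an image-labeling service such as Amazon Rekognition. A minimal sketch, assuming the image is stored in S3 (bucket and object names are placeholders):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "jug.jpg"}},  # placeholders
    MaxLabels=10,
    MinConfidence=50,
)

# Confidence values are percentages, matching the scores listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')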

Imagga
created on 2022-01-29

vase 100
vessel 93.3
jar 90.1
container 63
brown 28
jug 24
bread 23
food 22.7
pitcher 20.6
bottle 20.1
loaf 19.4
meal 17.8
baked 16.9
breakfast 16.8
wheat 14.5
crust 14.5
bakery 14.3
old 13.9
grain 13.8
ancient 13.8
flour 13.6
water jug 13.4
traditional 13.3
delicious 13.2
close 12.6
drink 12.5
closeup 12.1
lunch 12
whiskey jug 11.9
eating 11.8
sculpture 11.7
object 11.7
tasty 11.7
antique 11.4
culture 11.1
nobody 10.9
cooking 10.5
fresh 10.5
art 10.2
gourmet 10.2
healthy 10.1
beverage 9.9
kitchen 9.8
decoration 9.5
natural 9.4
yellow 9.3
dinner 9.3
carving 8.9
glass 8.7
earthenware 8.6
snack 8.5
eat 8.4
pottery 8.4
wood 8.3
alcohol 8.3
gold 8.2
slice 8.2
history 8
wooden 7.9
textured 7.9
liquid 7.8
bun 7.8
cereal 7.7
whole 7.6
stone 7.6
bubble 7.5
ceramic ware 7.5
dry 7.4
refreshment 7.3
dirty 7.2
black 7.2
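
The Imagga tags above follow the shape of Imagga's image-tagging output (a tag name plus a 0-100 confidence). A minimal sketch using the v2 /tags endpoint; the credentials and image URL are placeholders:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/jug.jpg"},  # placeholder image
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries the tag text under tag["en"] and a 0-100 confidence.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')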

Google
created on 2022-01-29

Vase 89.6
Pottery 83.9
Creative arts 82.7
Artifact 82.6
Art 80.1
earthenware 79.8
Serveware 76.9
Ceramic 74.6
Metal 67.3
Drinkware 67
Antique 63.6
History 57.5
Natural material 57.4
Peach 52
Clay 50.5
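
The Google tags above look like label-detection output from the Google Cloud Vision API, where each label has a description and a score. A minimal sketch, assuming application-default credentials are configured and using a placeholder image URI:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/jug.jpg"  # placeholder

response = client.label_detection(image=image)

# Scores are 0-1 floats; scale by 100 to match the percentages listed above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")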

Microsoft
created on 2022-01-29

vessel 84
artifact 77.1
sculpture 75.1
plant 61.6
jar 61.5
statue 59.7
ceramic ware 58.8
stoneware 27.6
stone 15.7
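
The Microsoft tags above match the output of an image-tagging call such as Azure Computer Vision's Tag Image operation. A minimal sketch; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

result = client.tag_image("https://example.org/jug.jpg")  # placeholder image
# Confidence is a 0-1 float; scale by 100 to match the scores listed above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")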

Feature analysis

Amazon

Bread 97.5%
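
The feature-analysis entry corresponds to a detected label that also carries localized object instances (bounding boxes), as returned in the same DetectLabels response sketched for the Amazon tags above. A minimal sketch of extracting those instances; the response dict below is a placeholder shaped like the real API output, reusing only the value shown above:

def instances(labels_response):
    """Yield (name, confidence, bounding box) for each localized object instance."""
    for label in labels_response["Labels"]:
        for inst in label.get("Instances", []):
            yield label["Name"], inst["Confidence"], inst["BoundingBox"]

# Placeholder response illustrating the shape; the bounding-box values are invented.
example = {
    "Labels": [
        {
            "Name": "Bread",
            "Confidence": 97.5,
            "Instances": [
                {
                    "Confidence": 97.5,
                    "BoundingBox": {"Left": 0.1, "Top": 0.1, "Width": 0.8, "Height": 0.8},
                }
            ],
        }
    ]
}

for name, conf, box in instances(example):
    print(f"{name} {conf:.1f}% at {box}")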

Captions

Microsoft

a vase sitting on a table 47.6%
a vase sitting on top of a table 43.6%
a brown vase on a table 43.5%
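
The captions above, each with a confidence, resemble the output of Azure Computer Vision's Describe Image operation. A minimal sketch; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

description = client.describe_image("https://example.org/jug.jpg", max_candidates=3)
# Each caption has text and a 0-1 confidence; scale by 100 to match the values above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")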