Human Generated Data

Title

Bottle

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Edward P. Bliss, 1916.305

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Food 99.1
Pear 99.1
Plant 99.1
Fruit 99.1
Pottery 89.3
Jar 83.3
Vase 76.2
Bottle 72.5
Jug 58.0
Glass 55.6
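Each tag above pairs a label with a confidence score. A minimal sketch (plain Python, scores transcribed from the Amazon list above; the threshold value is an arbitrary choice for illustration) of filtering such a list by confidence:

```python
# Amazon tag list transcribed from above: (label, confidence) pairs.
amazon_tags = [
    ("Food", 99.1), ("Pear", 99.1), ("Plant", 99.1), ("Fruit", 99.1),
    ("Pottery", 89.3), ("Jar", 83.3), ("Vase", 76.2), ("Bottle", 72.5),
    ("Jug", 58.0), ("Glass", 55.6),
]

def confident_labels(tags, threshold=80.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_labels(amazon_tags))
# ['Food', 'Pear', 'Plant', 'Fruit', 'Pottery', 'Jar']
```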

Imagga
created on 2022-01-29

vase 100
jar 100
vessel 97.4
container 78.9
bottle 42.9
jug 26.7
glass 26.4
food 25.5
drink 23.4
brown 20.6
alcohol 19.5
whiskey jug 18.2
object 17.6
traditional 16.6
close 16.5
beverage 15.4
liquid 14.9
decoration 13
pottery 12.6
healthy 12.6
nobody 12.4
fresh 12.4
cold 12
drop 11.8
vegetable 11.7
tasty 11.7
pot 11.5
meal 11.3
bread 11.3
yellow 11.2
wine 11.1
full 11
refreshment 10.9
decorative 10.9
beer 10.7
cooking 10.5
ancient 10.4
organic 10.1
nutrition 10.1
fruit 9.9
single 9.9
antique 9.7
diet 9.7
closeup 9.4
gourmet 9.3
eating 9.2
pitcher 9.1
earthenware 9.1
old 9.1
gold 9
plant 9
transparent 8.9
history 8.9
kitchen 8.9
loaf 8.7
party 8.6
culture 8.5
art 8.5
bubble 8.5
eat 8.4
color 8.3
delicious 8.2
slice 8.2
natural 8
utensil 8
water 8
cuisine 8
breakfast 7.9
ceramic ware 7.8
ceramics 7.8
clay 7.8
restaurant 7.8
snack 7.7
orange 7.7
craft 7.6
vintage 7.6
bubbles 7.6
lager 7.5
freshness 7.5
light 7.3
ripe 7.2
black 7.2
celebration 7.2
ingredient 7
wooden 7

Google
created on 2022-01-29

Vase 94.2
Artifact 84.8
Art 84
Creative arts 83.4
Pottery 83.3
Serveware 81.2
earthenware 77.4
Ceramic 72.9
Bottle 68.9
Plant 65.3
Antique 61.5
Still life photography 58.4

Microsoft
created on 2022-01-29

vessel 95.9
jar 89.2
vase 86.2
statue 73.4
artifact 70.4
sculpture 70.4
ceramic ware 53
stoneware 21.1
stone 15.9
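The four services above disagree on specifics, but several labels recur across lists. A sketch (plain Python; the per-provider sets below are transcribed from the lists above, truncated to the vessel-related entries for brevity) of finding labels that at least three of the four services agree on:

```python
from collections import Counter

# Vessel-related labels per provider, lowercased, transcribed from the
# tag lists above (non-vessel labels omitted for brevity).
provider_tags = {
    "Amazon":    {"pottery", "jar", "vase", "bottle", "jug", "glass"},
    "Imagga":    {"vase", "jar", "vessel", "container", "bottle",
                  "jug", "glass", "pottery"},
    "Google":    {"vase", "pottery", "earthenware", "ceramic", "bottle"},
    "Microsoft": {"vessel", "jar", "vase", "ceramic ware", "stoneware"},
}

# Count how many providers report each label.
counts = Counter(tag for tags in provider_tags.values() for tag in tags)

# Labels reported by at least three of the four services.
consensus = sorted(tag for tag, n in counts.items() if n >= 3)
print(consensus)
# ['bottle', 'jar', 'pottery', 'vase']
```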

Feature analysis

Amazon

Pear 99.1%

Captions

Microsoft

a vase sitting on a table 36.1%
a vase sitting on top of a table 32.8%
a close up of a vase 32.7%