Human Generated Data

Title

Jug

Date

3rd-4th century CE

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Mrs. Oric Bates, 1924.76

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2021-12-14

Food 87.4
Pear 87.4
Fruit 87.4
Plant 87.4
Beverage 86.9
Milk 86.9
Drink 86.9
Relish 81.8
Pickle 71.9
Bottle 71
Jug 59

Imagga
created on 2021-12-14

bottle 100
beer bottle 100
vessel 100
container 84.9
drink 45.9
glass 45.9
alcohol 45.3
beer 38.9
pop bottle 37.4
liquid 35.7
beverage 32.6
cold 31
transparent 29.6
full 28.3
bar 25.9
refreshment 24.5
lager 23.8
brown 23.5
close 22.8
ale 22.6
party 21.5
drop 20.9
gold 20.5
water 20
yellow 19.2
object 19
bottles 18.6
foam 18.4
wet 17.9
light 17.4
cap 17.2
pub 16.6
wine bottle 16.6
single 16.4
fresh 16.3
bubbles 16.1
golden 15.5
food 15.2
froth 14.7
cool 14.2
ice 13.8
brewery 13.8
closeup 13.5
tasty 13.4
clean 12.5
freshness 12.5
bubble 12.2
amber 11.7
mug 11.5
healthy 11.3
color 11.1
wine 11.1
vodka 10.8
empty 10.5
clear 10.5
celebration 10.4
restaurant 10.3
still 10.1
nobody 10.1
tavern 9.9
booze 9.8
frosty 9.8
appetizing 9.6
refreshing 9.6
life 9.4
spirits 8.9
bottled 8.9
cheers 8.8
alcoholic 8.8
objects 8.7
bright 8.6
still life 8.5
health 8.3
one 8.2
repose 7.9
brew 7.8
draft 7.8
fluid 7.8
table 7.8
gourmet 7.6
oil 7.4
fruit 7.3
reflection 7.3
diet 7.3
black 7.2

Google
created on 2021-12-14

Tableware 95.6
Drinkware 94.7
Bottle 94.4
Liquid 91.1
Serveware 86.5
Fluid 85.6
Artifact 84.4
Drink 82.5
Glass bottle 81.4
Creative arts 77.9
Glass 74.4
Art 74.3
Metal 69.7
Vase 66.5
Pottery 64.1
Still life photography 59
Composite material 53.9
Liqueur 53.2
Gas 50.2

Microsoft
created on 2021-12-14

indoor 95
vessel 73
jar 28.9
bottle 27.4

Feature analysis

Amazon

Pear 87.4%
Milk 86.9%
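The feature-analysis shortlist above appears to keep only high-confidence, specific tags from a provider's raw output. As a hypothetical sketch of that kind of filtering (the 85.0 cutoff and the list of generic labels to exclude are illustrative assumptions, not the museum's actual rule):

```python
# Raw Amazon tag/confidence pairs from the record above.
amazon_tags = {
    "Food": 87.4, "Pear": 87.4, "Fruit": 87.4, "Plant": 87.4,
    "Beverage": 86.9, "Milk": 86.9, "Drink": 86.9,
    "Relish": 81.8, "Pickle": 71.9, "Bottle": 71.0, "Jug": 59.0,
}

# Broad category labels we choose to skip (an illustrative assumption).
GENERIC = {"Food", "Fruit", "Plant", "Beverage", "Drink"}

def shortlist(tags, cutoff=85.0, exclude=GENERIC):
    """Keep specific tags at or above the confidence cutoff, highest first."""
    ranked = sorted(tags.items(), key=lambda kv: -kv[1])
    return [(t, c) for t, c in ranked if c >= cutoff and t not in exclude]

print(shortlist(amazon_tags))  # [('Pear', 87.4), ('Milk', 86.9)]
```

With these assumed parameters, the filter reproduces the two tags shown in the feature analysis (Pear 87.4%, Milk 86.9%).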

Captions

Microsoft

a close up of a bottle 69.9%
a vase sitting on a table 41.1%
a vase sitting on top of a table 38.4%