Human Generated Data

Title

Bowl

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.142

Machine Generated Data

Tags

Imagga
created on 2018-12-21

bowl 100
container 72.4
soup bowl 65.8
cup 50.9
mixing bowl 46.5
dish 43.3
drink 36.8
vessel 33
tea 32.2
china 28.8
tableware 28.4
mortar 27.9
hot 26
beverage 24.1
saucer 21.3
coffee 21.3
breakfast 21.2
food 19.3
pot 19.3
black 18
utensil 17.9
kitchen 17.9
mug 17.4
porcelain 17.3
morning 17.2
traditional 16.6
brown 16.2
ceramic 15.5
herbal 15.3
liquid 14.8
healthy 14.5
yellow 13.9
pestle 13.9
pottery 13.8
close 13.7
spoon 13.3
teapot 13
culture 12.8
teacup 12.8
refreshment 12.7
ceremony 12.6
relaxation 12.6
cooking 12.2
aroma 12.2
restaurant 12.1
health 11.8
ceramics 11.7
ceramic ware 11.7
oriental 11.3
heat 11.1
kitchenware 11.1
object 11
earthenware 10.9
spa 10.8
natural 10.7
plate 10.2
decoration 10.1
relax 10.1
caffeine 9.7
table 9.5
color 9.5
empty 9.5
closeup 9.4
therapy 9.4
water 9.3
tradition 9.2
hand tool 8.8
espresso 8.7
glass 8.7
lifestyle 8.7
lunch 8.6
nobody 8.6
gourmet 8.5
stone 8.4
warm 8.3
metal 8
ingredient 7.9
crockery 7.9
fresh 7.8
brew 7.8
tool 7.8
cups 7.8
serving 7.7
aromatic 7.7
flower 7.7
old 7.7
handle 7.6
break 7.6
dinner 7.6
single 7.4
treatment 7.3
meal 7.3
art 7.2

Google
created on 2018-12-21

Microsoft
created on 2018-12-21

indoor 96
black 71.2
tableware 66.8
dishware 30.9
cup 28.7
ceramic ware 24.5
bowl 18.2
pottery 18.2
ceramic 8.6

Captions

Microsoft

a vase sitting on a table 36.8%
a vase sitting on top of a wooden table 28.2%
a black and white photo of a bowl 28.1%