Human Generated Data

Title

Arretine Bowl (reproduction)

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of James Loeb, 1910.30

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Bowl 100
Mixing Bowl 99.7
Soup Bowl 88.3
Beverage 79.3
Milk 79.3
Drink 79.3

Imagga
created on 2022-01-29

bowl 95.5
mixing bowl 70.5
container 61.3
earthenware 58.3
ceramic ware 46.6
cup 45.4
utensil 41.7
tea 32
china 27
drink 25.9
tableware 25.4
soup bowl 23.5
dish 22.8
kitchen 19.7
object 19.1
beverage 18.6
pottery 18.1
pot 16.9
ceramic 16.5
food 16.3
porcelain 16
breakfast 15.9
hot 15.9
saucer 14.6
healthy 14.5
glass 14.1
vessel 13.7
herbal 13.4
color 13.4
traditional 13.3
black 13.2
close 13.1
empty 12.9
teacup 12.8
ceramics 12.7
nobody 12.4
old 11.9
closeup 11.5
brown 11
plate 11
decoration 10.9
natural 10.7
single 10.7
cooking 10.5
liquid 10.4
coffee 10.2
gourmet 10.2
art 9.8
clay 9.8
mug 9.6
vase 9.6
restaurant 9.5
luxury 9.4
heat 9.3
fresh 9.2
refreshment 9.1
gold 9.1
morning 9
health 9
diet 8.9
culture 8.6
flower 8.5
design 8.4
taste 8.3
tradition 8.3
warm 8.3
jug 7.9
teapot 7.9
antique 7.8
ceremony 7.8
tasty 7.5
aroma 7.5
rich 7.5
water 7.3
yellow 7.3
shiny 7.1
silver 7.1
leaf 7

Google
created on 2022-01-29

Tableware 97
Dishware 93.5
Mixing bowl 89.3
Serveware 85.5
Bowl 83.9
Pottery 83.7
Creative arts 83.5
Artifact 82
Porcelain 81.3
Natural material 80.3
earthenware 79.8
Art 77.1
Ceramic 73
Circle 71.4
Drinkware 69.1
Metal 66.4
Glass 55.7
Rectangle 54.6

Microsoft
created on 2022-01-29

cup 99.9
table 98.5
tableware 92.3
plant 90.6
bowl 88.5
ceramic 87.9
indoor 86.8
vase 74.9
earthenware 71.8
dishware 70.4
museum 62
pottery 57.3
ceramic ware 47
half 42.1
stoneware 39.7
close 26.7
porcelain 5.3
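
Each service above reports (label, confidence) pairs on a 0–100 scale, but labels differ in casing and wording across services. A minimal sketch of comparing them, using a hand-picked subset of the tags listed above (the `tags` dict and the `consensus` helper are illustrative, not part of any service's API):

```python
# A subset of the machine-generated tags above: service -> {label: confidence}.
tags = {
    "Amazon": {"Bowl": 100, "Mixing Bowl": 99.7, "Soup Bowl": 88.3,
               "Beverage": 79.3, "Milk": 79.3, "Drink": 79.3},
    "Imagga": {"bowl": 95.5, "mixing bowl": 70.5, "container": 61.3,
               "earthenware": 58.3, "pottery": 18.1},
    "Google": {"Tableware": 97, "Mixing bowl": 89.3, "Bowl": 83.9,
               "Pottery": 83.7, "earthenware": 79.8},
    "Microsoft": {"cup": 99.9, "bowl": 88.5, "earthenware": 71.8,
                  "pottery": 57.3},
}

def consensus(tags, min_services=3):
    """Return labels reported by at least `min_services` services,
    compared case-insensitively, with their per-service confidences."""
    merged = {}
    for service, labels in tags.items():
        for label, score in labels.items():
            merged.setdefault(label.lower(), {})[service] = score
    return {label: scores for label, scores in merged.items()
            if len(scores) >= min_services}

agreed = consensus(tags)
```

With this subset, "bowl" is the one label all four services report; "mixing bowl", "earthenware", and "pottery" each appear in three.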

Feature analysis

Amazon

Milk 79.3%

Captions

Microsoft

a close up of a bowl 84.8%
a close up of a glass bowl 77.4%
a close up of a bowl on a table 75.8%