Human Generated Data

Title

Arretine Bowl (reproduction)

Date

People
Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Museum of Fine Arts, through Lacey D. Caskey, Esq., 1933.38.3

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Plant 97.7
Bread 94.4
Food 94.4
Produce 85.3
Vegetable 84.8
Cushion 79.8
Pumpkin 71
Pottery 67.7
Gourd 63.6
Jar 63.4
Seed 59.3
Grain 59.3

Imagga
created on 2022-01-29

carving 77.7
sculpture 65.2
earthenware 57.2
plastic art 46.2
ceramic ware 43
food 33.6
utensil 28.1
art 26.1
brown 21.3
meat 18.9
raw 18.7
meal 17
fresh 17
figure 16.8
tasty 16.7
texture 16.7
slice 16.4
gourmet 16.1
butternut squash 16
bread 15.7
ingredient 14.9
close 14.8
squash 14.4
breakfast 14.1
fat 14
vegetable 13.9
orange 13.8
pumpkin 13.7
cuisine 13.3
winter squash 13.3
pastry 13.2
diet 12.9
snack 12.8
dinner 12.6
loaf 12.6
fillet 12.6
nobody 12.4
closeup 12.1
sweet 11.8
eating 11.8
delicious 11.5
steak 11.5
preparation 11.5
bakery 11.4
healthy 11.3
baked 11.2
freshness 10.8
natural 10.7
ottoman 10.7
salmon 10.6
section 10.6
pattern 10.2
fish 10.1
cooking 9.6
lunch 9.4
wooden 8.8
decoration 8.7
wheat 8.6
seat 8.4
nutrition 8.4
color 8.3
traditional 8.3
grain 8.3
backgrounds 8.1
market 8
yellow 7.9
autumn 7.9
colorful 7.9
uncooked 7.8
crust 7.7
culture 7.7
old 7.7
prepared 7.6
seafood 7.6
eat 7.5
single 7.4
light 7.3
object 7.3
fall 7.2
cut 7.2
paper 7.1
dessert 7
life 7

Google
created on 2022-01-29

Amber 81.6
Wood 81.5
Artifact 79.9
Natural material 74.2
Tints and shades 71.9
Creative arts 71.5
Art 68.9
Peach 62.9
Carmine 61.8
Font 58.6
Rock 53.9
Electric blue 53.5

Microsoft
created on 2022-01-29

orange 70
tan 55.1
ceramic ware 26.1
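The blocks above are label/confidence pairs produced by several automated tagging services, each scored on a 0–100 scale. As a minimal sketch of how such lists could be filtered by a confidence threshold, here is a small example using a hypothetical excerpt of the data (the `confident_tags` helper and the 75-point cutoff are illustrative assumptions, not part of any service's API):

```python
# Hypothetical excerpt of the machine-generated tags above,
# stored as (label, confidence) pairs per tagging service.
tags = {
    "Amazon": [("Plant", 97.7), ("Bread", 94.4), ("Food", 94.4), ("Pumpkin", 71.0)],
    "Google": [("Amber", 81.6), ("Artifact", 79.9), ("Rock", 53.9)],
    "Microsoft": [("orange", 70.0), ("tan", 55.1), ("ceramic ware", 26.1)],
}

def confident_tags(tags_by_service, threshold=75.0):
    """Keep only labels whose confidence meets the threshold, per service."""
    return {
        service: [label for label, score in pairs if score >= threshold]
        for service, pairs in tags_by_service.items()
    }

print(confident_tags(tags))
```

With the assumed 75-point cutoff, only the high-confidence labels (e.g. "Plant", "Bread", "Amber") survive, while weaker guesses such as "ceramic ware" at 26.1 are dropped.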

Feature analysis

Amazon

Bread 94.4%