Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2756

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Bread 97.2%
Food 97.2%
Clothing 84.9%
Apparel 84.9%
Fossil 79.9%
Soil 64.2%
Archaeology 60.1%
Slate 59%
Rock 55.2%
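
These labels are the kind of tag/confidence output produced by Amazon Rekognition's label detection. As a rough, hypothetical sketch of how such pairs can be generated with boto3 (the file name, region, and thresholds below are placeholders, not the settings used for this record):

    # Hypothetical sketch: label an image with Amazon Rekognition via boto3.
    # The file name, region, and thresholds are placeholders.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sherd.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=50.0,
    )

    # Print tag/confidence pairs in the same "Name score%" form used above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")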

Clarifai
created on 2023-10-27

no person 99.9%
one 99.2%
retro 98.3%
art 98%
wear 96.7%
ancient 95.6%
dirty 92.9%
rough 91.9%
architecture 89.1%
old 88.3%
stone 86.5%
antique 84.2%
interior design 84%
pottery 83.7%
prehistoric 82.9%
people 82.8%
rusty 82.4%
painting 80.7%
sculpture 80%
simplicity 79.3%
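
Clarifai returns concepts of this kind from a predict-style endpoint, with confidences between 0 and 1. The sketch below is illustrative only; the model ID, credentials, image URL, and payload shape are assumptions and may differ across Clarifai API versions.

    # Hypothetical sketch: request concepts from a Clarifai predict endpoint.
    # Model ID, API key, and image URL are placeholders; the payload shape
    # may differ between Clarifai API versions.
    import requests

    API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed general model ID

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/sherd.jpg"}}}]},
        timeout=30,
    )
    response.raise_for_status()

    # Concept values come back in 0-1; scale to percentages like the list above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}%")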

Imagga
created on 2022-01-23

earthenware 35.3%
conserve 29.6%
food 27.8%
ceramic ware 26.5%
meat 26.1%
raw 23.2%
dinner 21%
texture 20.8%
gourmet 20.4%
steak 20.2%
fat 18.6%
fresh 18.3%
fillet 17.4%
utensil 17.3%
cuisine 15.9%
ingredient 15.9%
candy 15.7%
beef 14.6%
salmon 13.5%
eat 13.4%
brown 13.2%
cooking 13.1%
fish 12.8%
freshness 12.5%
preparation 12.4%
paper 12%
close 12%
grunge 11.9%
slice 11.8%
tasty 11.7%
vintage 11.6%
closeup 11.4%
meal 11.4%
cut 10.8%
section 10.6%
pork 10.5%
diet 10.5%
old 10.4%
seafood 10.4%
lunch 10.3%
eating 10.1%
cook 10.1%
water 10%
yellow 9.9%
market 9.8%
uncooked 9.7%
fabric 9.7%
design 9.6%
piece 9.5%
jelly 9.4%
velvet 9.4%
delicious 9.1%
aged 9%
dirty 9%
pattern 8.9%
healthy 8.8%
restaurant 8.6%
empty 8.6%
pink 8.4%
retro 8.2%
backgrounds 8.1%
butcher 7.9%
sirloin 7.8%
nobody 7.8%
grill 7.7%
plate 7.7%
orange 7.7%
wallpaper 7.7%
antique 7.2%
kitchen 7.2%
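
Imagga returns tags of this form from its /v2/tags endpoint, authenticated with an API key and secret. A minimal sketch follows; the credentials and image URL are placeholders, not the values used for this record.

    # Hypothetical sketch: request tags from Imagga's /v2/tags endpoint.
    # Credentials and image URL are placeholders.
    import requests

    API_KEY = "YOUR_IMAGGA_KEY"
    API_SECRET = "YOUR_IMAGGA_SECRET"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/sherd.jpg"},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Each entry carries a confidence score (already 0-100) and a localized tag name.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}%")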

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

orange 50.8%

Color Analysis

Feature analysis

Amazon

Bread 97.2%

Categories

Imagga

food drinks 100%

Captions

Microsoft
created on 2022-01-23

map 91.2%
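
Both the Microsoft tag above and this caption are the kind of output returned by the Azure Computer Vision Analyze Image API, which reports confidences between 0 and 1. The sketch below is a hypothetical illustration; the endpoint, key, API version, and image URL are placeholders and not necessarily what was used for this record.

    # Hypothetical sketch: tag and caption an image with the Azure Computer Vision
    # Analyze Image REST API. Endpoint, key, API version, and image URL are placeholders.
    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_AZURE_VISION_KEY"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/sherd.jpg"},
        timeout=30,
    )
    response.raise_for_status()
    analysis = response.json()

    # Tags and captions both return confidences in 0-1; scale to percentages.
    for tag in analysis.get("tags", []):
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")
    for caption in analysis.get("description", {}).get("captions", []):
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")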