Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2813
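
The fields above come from the museum's catalog record, which can also be retrieved programmatically. A minimal sketch, assuming the Harvard Art Museums public API's `object` resource and that the `objectnumber` field matches the number in the credit line; the API key is a placeholder.

```python
# Minimal sketch: fetch this sherd's catalog record from the Harvard Art
# Museums API. HAM_API_KEY is a placeholder, and searching on "objectnumber"
# is an assumption based on the published record schema.
import requests

HAM_API_KEY = "YOUR_API_KEY"  # placeholder; keys are issued by the museum

resp = requests.get(
    "https://api.harvardartmuseums.org/object",
    params={"apikey": HAM_API_KEY, "q": "objectnumber:1977.216.2813"},
)
resp.raise_for_status()
for record in resp.json().get("records", []):
    print(record.get("title"), record.get("classification"), record.get("creditline"))
```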

Machine Generated Data

Tags

The number after each tag below is that service's confidence score, expressed as a percentage.

Amazon
created on 2019-03-25

Bread 99.5
Food 99.5
Apparel 96.9
Clothing 96.9
Helmet 96.7
Hardhat 96.7
Fossil 58.1
Soil 57.6
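
Labels like these come from Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3; the image file name is a placeholder, and AWS credentials and region are assumed to be configured in the environment.

```python
# Hedged sketch: reproduce label/confidence pairs like those above with
# Amazon Rekognition DetectLabels. "sherd.jpg" is a placeholder file.
import boto3

client = boto3.client("rekognition")

with open("sherd.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,  # the list above bottoms out near 57
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```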

Clarifai
created on 2019-03-25

no person 99.1
pottery 97.4
container 96.6
one 96.5
food 95.2
desktop 94.4
ceramic 93.1
cutout 92.6
art 92.6
disjunct 89.1
color 88.5
hard 88.3
vintage 85.6
grow 85.3
texture 82.2
still life 81.7
dish 81.6
round out 81.3
stranded 81.0
old 80.0
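
Concepts like these come from Clarifai's general visual-recognition model. A minimal sketch against the v2 REST API; the API key, model ID, and image URL are all placeholders.

```python
# Hedged sketch: general-model concepts like those above via Clarifai's
# v2 REST API. Key, model ID, and image URL are placeholders.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public-model alias

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/sherd.jpg"}}}]},
)
resp.raise_for_status()
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")  # value is 0-1
```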

Imagga
created on 2019-03-25

tile 72.2
food 40.9
sugar 31.7
earthenware 31.2
brown 29.4
chocolate 29.2
candy 28.6
snack 27.3
meal 24.3
dessert 23.6
ceramic ware 23.4
sweet 22.9
gourmet 22.9
tasty 22.6
delicious 22.3
close 22.2
piece 18.9
breakfast 18.5
diet 17.8
fat 17.7
eat 17.6
bread 16.6
utensil 15.3
nutrition 15.1
closeup 14.8
bakery 14.4
fresh 14.4
pastry 14.3
milk 14.3
baked 14.0
healthy 13.8
slice 13.6
cream 13.1
traditional 12.5
cake 12.2
eating 11.8
meat 11.7
nobody 11.7
cuisine 11.5
product 11.2
ingredient 10.7
dinner 10.1
dark 10.0
yellow 9.9
loaf 9.7
object 9.5
appetizer 9.4
lunch 9.4
single 9.0
cocoa 8.8
crust 8.7
life 8.6
culinary 8.6
plate 8.5
cut 8.1
cheese 8.1
delicatessen 7.8
restaurant 7.8
bake 7.7
old 7.7
treat 7.6
decoration 7.2
kitchen 7.2
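
Tags like these come from Imagga's tagging endpoint. A minimal sketch; the API key, secret, and image URL are placeholders.

```python
# Hedged sketch: tag/confidence pairs like those above via Imagga's
# /v2/tags endpoint, authenticated with HTTP Basic auth.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/sherd.jpg"},  # placeholder URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```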

Google
created on 2019-03-25

Orange 87.2
earthenware 73.0
Clay 57.5
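
Labels like these come from Google Cloud Vision's label detection. A minimal sketch with the google-cloud-vision client library; the image file is a placeholder and application credentials are assumed to be configured.

```python
# Hedged sketch: label detection like the list above via the Google Cloud
# Vision client library (pip install google-cloud-vision).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sherd.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")  # score is 0-1
```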

Microsoft
created on 2019-03-25

food 17.5
orange 9.5
recipe 5.0
homemade 4.6
chocolate 3.6
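
Tags like these come from the Azure Computer Vision Analyze operation. A minimal sketch against the 2019-era v2.0 REST endpoint; the endpoint host, subscription key, and image URL are placeholders.

```python
# Hedged sketch: tag/confidence pairs like those above via the Azure
# Computer Vision v2.0 Analyze endpoint.
import requests

AZURE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/sherd.jpg"},  # placeholder URL
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")  # confidence is 0-1
```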

Color Analysis

Feature analysis

Amazon

Bread 99.5%

Categories

Imagga

food drinks 99.7%

Captions

Microsoft
created on 2019-03-25

a piece of paper 46.7%
a piece of wood 46.6%
a piece of food 46.5%
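
Candidate captions like the three above come from the same Azure service's Describe operation, which returns several ranked captions per image. A minimal sketch using the same placeholder endpoint and key as the tags example.

```python
# Hedged sketch: ranked captions like the three above via the Azure
# Computer Vision v2.0 Describe endpoint.
import requests

AZURE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v2.0/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/sherd.jpg"},  # placeholder URL
)
resp.raise_for_status()
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```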