Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2773

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-03-26

Pottery 94
Food 86.5
Bread 86.5
Fossil 58.2

Clarifai
created on 2019-03-26

no person 98.5
food 96.8
pottery 95
desktop 94.6
one 93.9
slice 93.1
container 90.7
color 90.6
ceramic 90.5
grow 90.3
art 88.4
round out 88
closeup 86
plate 84.2
stranded 83.7
disjunct 82.8
dish 81.6
cut 81.4
healthy 81.4
round 80.9

Imagga
created on 2019-03-26

earthenware 53.2
food 43.4
candy 42.8
ceramic ware 40
conserve 37.3
sweet 28.4
chocolate 27.8
dessert 27.5
utensil 27.1
sugar 26.8
brown 25.8
delicious 24.8
tasty 23.4
eat 22.6
snack 22.2
gourmet 20.4
healthy 19.5
breakfast 18.6
nutrition 16.8
fresh 16.3
diet 16.2
orange 16.1
meal 15.4
close 15.4
cream 14
fat 14
taste 13.9
plate 13.6
ingredient 13.2
fruit 13.2
closeup 12.8
slice 12.7
dinner 12.6
object 12.5
pastry 12.4
milk 12.4
bread 11.2
lunch 11.1
freshness 10.8
honey 10.5
cake 10.3
cocoa 10.2
glass 10.1
eating 10.1
raw 9.8
yummy 9.6
sauce 9.5
jelly 9.5
color 9.5
yellow 9.3
ripe 9.1
container 9
black 9
cuisine 8.9
creamy 8.8
jar 8.8
loaf 8.7
cooking 8.7
natural 8.7
health 8.3
kitchen 8
jam 7.8
temptation 7.7
edible 7.6
texture 7.6
bakery 7.6
spoon 7.6
treat 7.6
product 7.5
baked 7.5
traditional 7.5
syrup 7.4
dish 7.2
cut 7.2
tea 7.1

Google
created on 2019-03-26

Orange 93
earthenware 92.3
Pottery 77.4
Ceramic 77.3
Dishware 57.4
Plate 56.1
Clay 51.9
Art 50.2

Microsoft
created on 2019-03-26

orange 85.5
ceramic ware 61.9
chocolate 61.9
cake 55.7
food 44.7
dessert 28.9
bakery 15.6
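
A minimal sketch (not part of the museum's pipeline) of one way to cross-reference the machine-generated tags above: normalize the tag names, then keep tags that at least two services report above a confidence cutoff. The tag/confidence pairs are a subset copied from the listings above; the function name and thresholds are illustrative assumptions.

```python
# Sketch: find tags that multiple vision services agree on for this sherd.
# Data below is a subset of the per-service tag listings above.
from collections import defaultdict

tags_by_service = {
    "Amazon":    {"pottery": 94.0, "food": 86.5, "bread": 86.5, "fossil": 58.2},
    "Clarifai":  {"food": 96.8, "pottery": 95.0, "ceramic": 90.5, "plate": 84.2},
    "Imagga":    {"earthenware": 53.2, "food": 43.4, "plate": 13.6, "bread": 11.2},
    "Google":    {"orange": 93.0, "earthenware": 92.3, "pottery": 77.4, "ceramic": 77.3},
    "Microsoft": {"orange": 85.5, "ceramic ware": 61.9, "food": 44.7},
}

def consensus_tags(tags_by_service, min_confidence=40.0, min_services=2):
    """Return tags reported above min_confidence by at least min_services,
    mapped to the (service, confidence) pairs that support them."""
    votes = defaultdict(list)
    for service, tags in tags_by_service.items():
        for tag, conf in tags.items():
            if conf >= min_confidence:
                votes[tag].append((service, conf))
    return {tag: hits for tag, hits in votes.items() if len(hits) >= min_services}

agreed = consensus_tags(tags_by_service)
for tag, hits in sorted(agreed.items()):
    print(tag, "->", ", ".join(f"{s} {c}" for s, c in hits))
```

With the cutoffs above, the services converge on tags like "pottery", "ceramic", and "earthenware", which match the human-assigned classification (Fragments) better than any single service's top list does.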

Color Analysis

Feature analysis

Amazon

Bread 86.5%

Categories

Imagga

food drinks 100%

Captions