Human Generated Data

Title: Sherd
Date: -
People: -
Classification: Fragments
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2797

Machine Generated Data

Tags (each service lists its labels with a confidence score, shown as a percentage)

Amazon
created on 2019-03-26

Bowl 93.9
Plant 87.8
Pottery 78.2
Food 73.5
Bread 73.5
Vegetable 60
Ketchup 55.7
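
The pairs above are raw output from an image-labeling service: a label name followed by the model's confidence, in percent. As an illustrative sketch only (the record does not document the museum's actual pipeline), comparable output can be produced with Amazon Rekognition's DetectLabels API via boto3; the file name and thresholds below are placeholder assumptions.

    import boto3

    # Sketch: request labels for a local image from Amazon Rekognition.
    # Assumes AWS credentials are configured; "sherd.jpg" is a placeholder.
    client = boto3.client("rekognition")

    with open("sherd.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,         # return at most 10 labels
            MinConfidence=50.0,   # drop labels the model scores below 50%
        )

    # Print "Label confidence" pairs in the same shape as the list above,
    # e.g. "Bowl 93.9" / "Pottery 78.2".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')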

Clarifai
created on 2019-03-26

no person 99.8
food 98.6
one 98.1
container 98.1
pottery 97.9
grow 96.5
still life 94.8
tableware 93.8
cutout 93
earthenware 91.5
homemade 90.9
clay 90.5
fruit 90.1
round out 89.1
plate 88
traditional 87.9
dish 87.4
kitchenware 87.2
breakfast 87.1
simplicity 86.1
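
Clarifai reports concept confidences on a 0-1 scale, shown here scaled to percentages. A hedged sketch of a v2 predict request using the requests library; the API key, model ID, and image URL are placeholders, not values from this record.

    import requests

    # Sketch: tag an image with Clarifai's v2 predict endpoint.
    # API_KEY, MODEL_ID, and the image URL are placeholder assumptions.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"  # placeholder general-model ID

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/sherd.jpg"}}}]},
    )
    response.raise_for_status()

    # Each concept carries a name and a 0-1 confidence; scale to percent.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')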

Imagga
created on 2019-03-26

earthenware 100
ceramic ware 85.5
utensil 56.8
food 40.6
fruit 28.1
orange 27.1
sweet 25.3
fresh 24.8
healthy 20.8
close 20.5
citrus 20.4
vitamin 19.5
juicy 17.3
brown 16.2
eating 16
organic 16
ripe 15.4
candy 15.4
eat 15.1
tasty 15
drink 15
breakfast 15
juice 14.8
object 14.7
tangerine 14.2
color 13.9
snack 13.7
freshness 13.3
diet 12.9
chocolate 12.9
gourmet 12.7
nutrition 12.6
sugar 12.6
dessert 12.4
vegetarian 11.6
natural 11.4
taste 11.1
tropical 11.1
slice 10.9
nobody 10.9
delicious 10.7
single 10.7
yellow 10.6
health 10.4
bread 10.3
mandarin 10.2
beverage 10.1
refreshment 10.1
peel 9.8
vegetable 9.7
circle 9.7
cake 9.4
baked 9.4
horizontal 9.2
studio 9.1
pumpkin 9.1
raw 8.9
ingredient 8.9
round 8.6
harvest 8.5
texture 8.3
autumn 7.9
seasonal 7.9
loaf 7.8
lunch 7.7
yummy 7.7
pastry 7.6
meal 7.3
fall 7.2
shiny 7.1
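
Imagga's tags follow the same label-plus-confidence pattern. A sketch of its v2 tagging endpoint using HTTP Basic auth with the requests library; the key, secret, and image URL are placeholder assumptions.

    import requests

    # Sketch: fetch tags from Imagga's v2 /tags endpoint.
    # The API key/secret pair and the image URL are placeholders.
    API_KEY = "YOUR_IMAGGA_KEY"
    API_SECRET = "YOUR_IMAGGA_SECRET"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/sherd.jpg"},
        auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence already in percent.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')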

Google
created on 2019-03-26
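
No tags are recorded under Google for this object. For reference only, label detection with the google-cloud-vision client typically looks like the sketch below; the file name is a placeholder, and scores come back on a 0-1 scale.

    from google.cloud import vision

    # Sketch: label detection with Google Cloud Vision.
    # Assumes application credentials are configured; "sherd.jpg" is a placeholder.
    client = vision.ImageAnnotatorClient()

    with open("sherd.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # label.score is 0-1; scale to percent to match the other services.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")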

Microsoft
created on 2019-03-26

orange 69.3
ceramic ware 44.4
chocolate 44.4
cake 37.5
dessert 16.2
food 15.9
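
Microsoft's tags correspond to the Azure Computer Vision "analyze" endpoint. A hedged requests sketch against the 2019-era v2.0 REST API; the region, subscription key, and file name are placeholder assumptions.

    import requests

    # Sketch: request tags from Azure Computer Vision v2.0 (REST).
    # ENDPOINT and KEY are placeholders, as is "sherd.jpg".
    ENDPOINT = "https://westus.api.cognitive.microsoft.com"
    KEY = "YOUR_AZURE_CV_KEY"

    with open("sherd.jpg", "rb") as f:
        response = requests.post(
            f"{ENDPOINT}/vision/v2.0/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    response.raise_for_status()

    # Tag confidences come back on a 0-1 scale; scale to percent.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')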

Color Analysis

Feature analysis

Amazon

Bread 73.5%
Ketchup 55.7%

Categories

Imagga

paintings art 59.8%
food drinks 40%

Captions