Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2781

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Meal 92.1
Dish 92.1
Food 92.1
Ketchup 91.6
Pottery 88.3
Clothing 73.9
Helmet 73.9
Hardhat 73.9
Apparel 73.9
Potted Plant 57.4
Jar 57.4
Vase 57.4
Plant 57.4

Clarifai
created on 2019-03-26

no person 98.6
wear 95.9
pottery 94.2
helmet 94
architecture 92.7
container 91.7
one 91.5
expression 89.5
food 89.3
building 89
plastic 88.6
empty 88.3
cutout 84.8
round out 84.6
vintage 83.5
paper 83.3
retro 82.9
art 82.2
dirty 82
design 81.9

Imagga
created on 2019-03-26

earthenware 38.4
container 33.4
bag 31.9
ceramic ware 28.9
food 21.1
tile 19.3
utensil 18.9
sweet 18.2
brown 17.6
baseball glove 17
object 16.8
orange 16.2
mailbag 15.7
glass 15.5
dessert 12.4
fresh 12.4
close 12
sugar 12
conserve 12
healthy 12
nutrition 11.7
single 11.5
ingredient 11.5
fruit 11.2
fat 11.2
eat 10.9
freshness 10.8
leather 10.4
health 10.4
taste 10.2
yellow 9.9
delicious 9.9
chocolate 9.9
breakfast 9.7
empty 9.6
color 9.4
snack 9.4
jar 8.9
meal 8.9
liquid 8.7
decoration 8.7
travel 8.4
product 8.4
drink 8.3
fall 8.1
drum 8.1
diet 8.1
water 8
nobody 7.8
vessel 7.7
gourmet 7.6
accessory 7.6
eating 7.6
bottle 7.5
shape 7.4
bar 7.4
light 7.3
refreshment 7.2
pumpkin 7.2
metal 7.2
beverage 7.2
bright 7.1
life 7

Google
created on 2019-03-26

Orange 90.8
Helmet 72.8

Microsoft
created on 2019-03-26

orange 83.2
ceramic ware 20.4
chocolate 20.4
food 5.5
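The tag lists above share a simple line format: a label followed by a confidence score, where the label itself may contain spaces (e.g. "no person 98.6", "Potted Plant 57.4"). A minimal sketch of parsing such lines into (label, score) pairs, assuming that format holds throughout:

```python
def parse_tags(lines):
    """Split each 'label score' line into a (label, score) pair.

    The score is always the last whitespace-separated token, so we
    split on the final space; everything before it is the label.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Example using a few of the Amazon tags listed above.
amazon = parse_tags("""
Meal 92.1
Ketchup 91.6
Potted Plant 57.4
""".splitlines())

print(amazon)
# → [('Meal', 92.1), ('Ketchup', 91.6), ('Potted Plant', 57.4)]
```

Splitting from the right (`rpartition`) rather than the left is what keeps multi-word labels like "Potted Plant" intact.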

Feature Analysis

Amazon

Ketchup 91.6%

Categories

Imagga

food drinks 100%