Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2754

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-03-26

Bread 98.6
Food 98.6
Armor 58
Apparel 57.4
Clothing 57.4

Clarifai
created on 2019-03-26

no person 99.6
one 97.6
container 94.8
still life 92
food 91.8
wear 91.1
grow 90.6
painting 90.3
art 89.2
cutout 88.4
pottery 83.3
paper 80
desktop 78.1
retro 77.8
text 75.7
color 74.7
old 74.6
vintage 74.2
pattern 71
invertebrate 70.3

Imagga
created on 2019-03-26

bag 28
container 26
holster 25.2
mailbag 21.5
paper 21.2
sheath 20.2
brown 18.4
old 18.1
texture 18
protective covering 17.7
vintage 16.5
blank 16.4
antique 15.7
retro 15.6
page 14.8
grunge 14.5
dirty 14.4
empty 13.7
close 13.7
yellow 13.2
covering 12.7
frame 12.5
food 12.4
closeup 12.1
book 11.8
aged 11.8
object 11
earthenware 10.5
detail 10.4
sheet 10.3
drink 10
cardboard 9.6
design 9.6
ancient 9.5
glass 9.3
tile 8.6
art 8.5
honey 8.3
bar 8.3
note 8.3
letter 8.2
gold 8.2
pattern 8.2
rough 8.2
diet 8.1
leather 8
ceramic ware 7.9
shabby 7.8
crumpled 7.8
scrapbook 7.8
golden 7.7
parchment 7.7
freshness 7.5
delicious 7.4
style 7.4
cover 7.4
slice 7.3
border 7.2

Google
created on 2019-03-26

Orange 88.1
Leather 53.9

Microsoft
created on 2019-03-26

accessory 40.7
orange 40.7
abstract 36.3
bowl 24.1
food 21.2
art 20.9

Color Analysis

Feature Analysis

Amazon

Bread 98.6%

Categories

Imagga

food drinks 100%

Captions

Microsoft
created on 2019-03-26

a piece of wood 43.6%
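The machine-generated sections above are flat lists of label/confidence pairs from several vision services. As a minimal sketch of how such listings might be worked with, the snippet below parses two of the blocks (tag strings copied from the Amazon and Microsoft sections; the `parse_tags` helper is a hypothetical illustration, not any service's API):

```python
def parse_tags(block: str) -> dict[str, float]:
    """Split each 'label score' line into a lowercased label and a float score."""
    tags = {}
    for line in block.strip().splitlines():
        # Scores trail the label, so split once from the right;
        # this keeps multi-word labels like "no person" intact.
        label, score = line.rsplit(maxsplit=1)
        tags[label.lower()] = float(score)
    return tags

# Tag lines copied verbatim from the Amazon section above.
amazon = parse_tags("""\
Bread 98.6
Food 98.6
Armor 58
Apparel 57.4
Clothing 57.4
""")

# Tag lines copied verbatim from the Microsoft section above.
microsoft = parse_tags("""\
accessory 40.7
orange 40.7
abstract 36.3
bowl 24.1
food 21.2
art 20.9
""")

# Labels both services assigned, compared case-insensitively.
shared = sorted(set(amazon) & set(microsoft))
print(shared)  # ['food']
```

Case-insensitive comparison matters here because the services disagree on capitalization ("Food" vs "food", "Orange" vs "orange").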