Human Generated Data

Title

Sherd from the Bottom of a Vessel

Date

918-1392 CE

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.58

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Axe 99.5
Tool 99.5
Cushion 94.3
Soil 77.2
Pottery 65.1
Pillow 63

Clarifai
created on 2019-07-07

no person 99.5
cutout 96.1
stranded 94.1
food 93
desktop 92.8
texture 91.5
disjunct 91.2
one 88.8
old 86.9
color 86.2
wear 85.6
ancient 81
nature 80.6
paper 80.3
still life 78.9
pattern 78.6
retro 78.1
closeup 77.4
isolate 77.1
broken 76.8

Imagga
created on 2019-07-07

tile 51.6
pillow 42.3
cushion 31
brown 24.3
padding 20.5
object 19
paper 18.8
blank 16.8
old 16
food 15.9
texture 15.3
grunge 14.5
breakfast 14.1
vintage 14.1
bread 13.9
close 12
nobody 11.7
corbel 11.5
meal 11.4
bag 11.3
yellow 11.3
antique 11.2
page 11.1
decoration 10.9
tasty 10.9
aged 10.9
closeup 10.8
objects 10.4
container 10.4
book 10.3
design 10.2
healthy 10.1
eat 10.1
rough 10
material 9.8
retro 9.8
manuscript 9.8
decor 9.7
wheat 9.5
snack 9.4
bracket 9.2
slice 9.1
toast 9
detail 8.8
torn 8.7
natural 8.7
light 8.7
ancient 8.6
color 8.3
cover 8.3
grain 8.3
delicious 8.3
pattern 8.2
dirty 8.1
textured 7.9
art 7.9
loaf 7.8
crumpled 7.8
leather 7.7
piece 7.7
cut out 7.6
eating 7.6
wallet 7.5
dark 7.5
one 7.5
support 7.4
device 7.3
fabric 7.3
copy space 7.2
black 7.2
soft 7.2
fresh 7.2

Google
created on 2019-07-07

Clay 83.8
Rock 73.4
earthenware 73
Beige 70.8
Ceramic 65.2
Pottery 63
Artifact 62.7

Microsoft
created on 2019-07-07

stone 63.7
chocolate 57.6
half 39
eaten 25.4

Feature analysis

Amazon

Axe 99.5%