Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2735

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Bread 97.1
Food 97.1
Apparel 96.5
Clothing 96.5
Ornament 67.2
Accessory 66.4
Accessories 66.4
Archaeology 55.3
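
The Amazon entries above (a label name followed by a confidence percentage) have the shape of output from Amazon Rekognition's DetectLabels operation. As an illustrative sketch only, assuming boto3 and a hypothetical image location (the museum's actual pipeline is not documented here), tags like these could be retrieved as follows:

```python
# Illustrative sketch: labels of this shape could be produced with Amazon
# Rekognition's DetectLabels API via boto3. The bucket and file name are
# hypothetical placeholders, not the museum's actual storage.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "sherd.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Each label carries a name and a confidence percentage,
# e.g. "Bread 97.1", "Food 97.1", "Apparel 96.5" ...
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```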

Clarifai
created on 2019-03-26

no person 99.1
one 94.8
grow 94.5
food 94
still life 93.2
cutout 88.1
wear 83.8
retro 83.6
art 77.1
desktop 76
container 75.9
people 71.8
old 71.8
painting 71.5
color 71.3
meat 71
shape 70.7
invertebrate 70.1
vintage 69
texture 68.9
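
The Clarifai list above pairs concept names with confidence scores, which matches the output of Clarifai's v2 predict endpoint. A minimal sketch, with the API key, model ID, and image URL all as placeholders (the specific model used for this record is not stated):

```python
# Illustrative sketch: concepts of this shape could come from Clarifai's v2
# predict endpoint. API key, model ID, and image URL are placeholders.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"            # placeholder
MODEL_ID = "GENERAL_MODEL_ID"                # placeholder for a public general model
IMAGE_URL = "https://example.org/sherd.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts carry a name and a 0-1 value; scaling by 100 gives
# percentages like "no person 99.1", "one 94.8", "grow 94.5" ...
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```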

Imagga
created on 2019-03-26

footwear 84.5
sock 61.9
hosiery 49.3
covering 39.4
clothing 35.5
clog 31.4
shell 21.7
fashion 21.1
pillow 17.4
boot 16.2
foot 14.3
style 14.1
brown 14
shoe 13.4
cowboy boot 13
consumer goods 12.7
shoes 12.5
leather 12.3
cushion 12.1
feet 11.6
pair 10.4
pattern 10.3
bread 10.2
old 9.7
interior 9.7
health 9.7
body 9.6
color 9.5
legs 9.4
design 9.3
new 8.9
healthy 8.8
food 8.5
texture 8.3
stylish 8.1
women 7.9
boots 7.8
leg 7.8
elegant 7.7
wear 7.7
padding 7.6
skin 7.6
feminine 7.5
closeup 7.4
retro 7.4
inside 7.4
home 7.2
modern 7
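
Tag lists like the Imagga one above can be obtained from Imagga's v2 tagging endpoint. The sketch below is illustrative only; the credentials and image URL are placeholders. The separate "Categories" entry further down ("food drinks 100%") presumably comes from Imagga's categorization endpoint rather than from this tagging call.

```python
# Illustrative sketch: fetching tags from Imagga's v2 /tags endpoint.
# API key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"                  # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"            # placeholder
IMAGE_URL = "https://example.org/sherd.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry pairs a confidence percentage with an English tag,
# e.g. "footwear 84.5", "sock 61.9", "hosiery 49.3" ...
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```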

Google
created on 2019-03-26

earthenware 66.4
Artifact 66.1
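
Labels such as "earthenware" and "Artifact" with percentage scores are the kind of output Google Cloud Vision label detection returns. A minimal sketch, assuming the google-cloud-vision client library and a placeholder local file:

```python
# Illustrative sketch: label detection with the Google Cloud Vision client.
# The file path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sherd.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scaling by 100 gives values like "earthenware 66.4".
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```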

Microsoft
created on 2019-03-26

slice 69.3
half 44
eaten 21.7
art 21.7
bowl 10.9
slipper 5.6

Color Analysis

Feature analysis

Amazon

Bread 97.1%

Categories

Imagga

food drinks 100%

Captions

Microsoft
created on 2019-03-26

a piece of bread 80.1%
a piece of bread on a plate 57.1%
a half eaten apple 55.1%
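
Both the Microsoft tag list earlier ("slice", "half", "eaten", ...) and the ranked captions above resemble the output of Azure Computer Vision's Analyze Image call with the Tags and Description features. The sketch below is illustrative only; the endpoint, subscription key, and image URL are placeholders.

```python
# Illustrative sketch: Azure Computer Vision "Analyze Image" with the Tags
# and Description features returns both a tag list and ranked captions.
# Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_KEY"                                 # placeholder
IMAGE_URL = "https://example.org/sherd.jpg"                   # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()
analysis = response.json()

# Tags: confidences are 0-1, e.g. "slice 69.3", "half 44.0" ...
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')

# Captions: ranked sentences such as "a piece of bread 80.1".
for caption in analysis["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')
```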

Text analysis

Amazon

726.30
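
A detected string such as "726.30" (apparently an inventory number on the sherd itself) is the kind of result Amazon Rekognition's DetectText operation returns. As a hedged sketch, with the image location again a hypothetical placeholder:

```python
# Illustrative sketch: reading text painted or written on an object with
# Amazon Rekognition's DetectText API. The image location is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "sherd.jpg"}}
)

# LINE detections give whole strings (e.g. "726.30");
# WORD detections give the individual tokens inside them.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```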