Human Generated Data

Title: Sherd
Date: -
People: -
Classification: Fragments
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2776

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-23

Clothing 92.2
Apparel 92.2
Bread 89.9
Food 89.9
Slate 81.4
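
Labels like the Amazon list above are the kind of output returned by Amazon Rekognition's DetectLabels operation. The sketch below is a minimal, assumed example using boto3; the bucket name, object key, and thresholds are placeholders, not details taken from this record.

```python
# A minimal sketch (assumed setup): label detection with Amazon Rekognition.
# Bucket, object key, and thresholds are placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "sherd.jpg"}},
    MaxLabels=10,
    MinConfidence=80.0,
)

for label in response["Labels"]:
    # Confidence is already a 0-100 percentage, as in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```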

Clarifai
created on 2023-10-27

no person 99.9
one 98.8
art 98.4
wear 95.1
retro 94.4
ancient 93.9
prehistoric 92.1
sculpture 91.6
simplicity 91.3
people 91.1
architecture 89
old 88.7
rough 87.4
two 85.3
stone 84.2
paper 84
dirty 83.4
creativity 82.7
travel 81.9
industry 81.3
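
Concept tags like the Clarifai list above come from its general image-recognition model. A minimal sketch against the public v2 REST endpoint follows; the API key, model id, and image URL are placeholder assumptions.

```python
# A minimal sketch (assumed setup): concept prediction via Clarifai's v2
# REST API. The API key, model id, and image URL are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed id of the public general model
IMAGE_URL = "https://example.org/sherd.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in 0-1; scaled to percent to match the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```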

Imagga
created on 2022-01-23

holster 100
sheath 100
protective covering 77.5
covering 53.3
food 23.7
close 16.5
brown 15.4
delicious 14.9
orange 13.8
gourmet 13.6
sweet 13.4
yellow 13.2
fat 13
eat 12.6
snack 12
slice 11.8
tile 11.8
tasty 11.7
diet 11.3
chocolate 10.9
dessert 10.7
fresh 10.5
texture 10.4
closeup 10.1
healthy 10.1
meat 9.9
meal 9.7
piece 9.5
color 9.4
earthenware 9.3
eating 9.2
breakfast 8.8
ingredient 8.8
decoration 8.7
sugar 8.5
wood 8.3
bar 8.3
dinner 7.6
pattern 7.5
baked 7.5
bread 7.4
object 7.3
cook 7.3
fall 7.2
colorful 7.2
bright 7.1
cuisine 7.1
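
Imagga's tagging output has this shape when requested from its v2 /tags endpoint. The sketch below is illustrative only; the API key, secret, and image URL are placeholders.

```python
# A minimal sketch (assumed setup): tagging via Imagga's v2 /tags endpoint.
# The API key/secret and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/sherd.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Each entry pairs a tag name with a 0-100 confidence score.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```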

Google
created on 2022-01-23

Brown 98
Amber 84.2
Artifact 71.4
Fashion accessory 67.5
Peach 64.5
Carmine 64.3
Metal 63.7
Art 60.5
Rock 60.3
Illustration 54.2
Font 53.9
Mineral 52.7
Coquelicot 50.8
Wood 50.3
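
The Google labels correspond to Cloud Vision label detection. A minimal sketch with the google-cloud-vision client follows; the image URL is a placeholder, and scores are scaled from 0-1 to percent to match the list above.

```python
# A minimal sketch (assumed setup): label detection with the
# google-cloud-vision client. The image URL is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/sherd.jpg"))

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are 0-1; scaled to percent to match the list above.
    print(f"{label.description} {label.score * 100:.1f}")
```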

Microsoft
created on 2022-01-23

map 79.9
tan 51.6
orange 40.1
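
The Microsoft tags match the output of the Azure Computer Vision tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

```python
# A minimal sketch (assumed setup): tag an image with Azure Computer Vision.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/sherd.jpg")
for tag in result.tags:
    # Confidence is reported in 0-1; scaled to percent as in the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```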

Feature analysis

Amazon

Bread 89.9%

Categories

Imagga

food drinks 100%

Captions

Microsoft
created on 2022-01-23

a piece of paper 47.5%
a piece of wood 47.4%
a close up of a piece of paper 37.2%
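
Ranked captions like these are what Azure Computer Vision's describe operation returns. A minimal sketch follows, again with placeholder endpoint, key, and image URL.

```python
# A minimal sketch (assumed setup): request candidate image captions from
# Azure Computer Vision. Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image("https://example.org/sherd.jpg", max_candidates=3)
for caption in description.captions:
    # Each candidate caption carries a 0-1 confidence, shown above as a percent.
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```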