Human Generated Data

Title: Sherd
Date: -
People: -
Classification: Fragments
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2745
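
The accession number at the end of the credit line (1977.216.2745) identifies this record in the Harvard Art Museums collection, which exposes object data through a public API. Below is a minimal sketch of looking the record up; the base URL is the museums' documented endpoint, but the parameter names (apikey, q) and the HAM_API_KEY placeholder are assumptions to verify against the API docs.

import requests

# Minimal sketch: fetch this object's record from the Harvard Art Museums
# public API. HAM_API_KEY is a hypothetical placeholder; the apikey and q
# parameter names are assumptions based on the published API documentation.
HAM_API_KEY = "your-api-key"

resp = requests.get(
    "https://api.harvardartmuseums.org/object",
    params={"apikey": HAM_API_KEY, "q": "1977.216.2745"},
    timeout=10,
)
resp.raise_for_status()
for record in resp.json().get("records", []):
    print(record.get("title"), "-", record.get("classification"))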

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Wood 85.6
Food 74.2
Bread 74.2
Brick 64.1
Plywood 58
Fossil 56.5
Archaeology 55.7
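
Labels like these, each with a percentage confidence, are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local image file sherd.jpg:

import boto3

# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "sherd.jpg" is a hypothetical path.
client = boto3.client("rekognition")

with open("sherd.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=55,  # roughly the cutoff implied by the lowest score above
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')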

Clarifai
created on 2019-03-26

no person 99.7
one 98.5
food 95.2
wear 94.3
chocolate 91.3
still life 90.3
sugar 88.8
handmade 84.9
two 83.8
desktop 82.9
delicious 82.5
cutout 82.4
simplicity 81.6
candy 81.4
indoors 80.6
grow 80.3
dirty 78.9
retro 78.8
color 78.2
homemade 77.4

Imagga
created on 2019-03-26

glass 31.1
drink 27.5
beverage 24.5
cold 22.4
alcohol 21.5
binding 21.2
close 20
yellow 19.9
container 19.3
beer 18.4
brown 17.7
bar 17.5
bottle 16.1
refreshment 15.6
vessel 15.1
fresh 15
liquid 14.9
foam 14.5
lager 14.3
gold 13.1
party 12.9
food 12.8
closeup 12.8
pub 12.7
golden 12
ale 11.8
transparent 11.6
refreshing 11.5
sweet 11.1
wet 10.7
cool 10.6
ice 10.5
bubbles 10.4
paper 10.3
full 10
dessert 10
pint 9.9
froth 9.8
amber 9.8
texture 9.7
restaurant 9.5
healthy 9.4
bubble 9.4
holster 9.3
slice 9.1
health 9
celebration 8.8
vase 8.8
board 8.6
orange 8.5
juice 8.4
delicious 8.3
pattern 8.2
envelope 8.2
diet 8.1
cocktail 8
color 7.8
nutrition 7.5
sheath 7.5
tasty 7.5
freshness 7.5
support 7.3
drop 7.2

Google
created on 2019-03-26

Tan 88.5
Brown 81.2
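
Google's tags correspond to Cloud Vision label detection, which returns scores in the 0-1 range (shown above as percentages). A minimal sketch with the google-cloud-vision client (v2+), assuming application-default credentials and the same hypothetical sherd.jpg:

from google.cloud import vision

# Minimal sketch: label detection with the Google Cloud Vision client.
# Assumes application-default credentials; "sherd.jpg" is a hypothetical path.
client = vision.ImageAnnotatorClient()

with open("sherd.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # label.score is 0-1; scale to match the percentages listed above
    print(f"{label.description} {label.score * 100:.1f}")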

Microsoft
created on 2019-03-26

orange 55.7
food 30.4
chocolate 18

Color Analysis

[color swatches from the original page are not preserved in this text extraction]

Feature Analysis

Amazon

Bread 74.2%

Categories

Imagga

food drinks 100%

Captions

Microsoft
created on 2019-03-26

a close up of a box 65.5%
a piece of paper 42.3%
an orange box on a table 25.6%
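
These captions, with their confidence scores, match the output of the Azure Computer Vision describe-image operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; ENDPOINT and KEY are hypothetical placeholders for a real Azure resource, and sherd.jpg is again a hypothetical path:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Minimal sketch: image captioning with the Azure Computer Vision SDK.
# ENDPOINT and KEY are hypothetical placeholders for a real Azure resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "your-key"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("sherd.jpg", "rb") as f:
    analysis = client.describe_image_in_stream(f)

for caption in analysis.captions:
    # confidence is 0-1; scale to match the percentages listed above
    print(f"{caption.text} {caption.confidence * 100:.1f}%")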