Human Generated Data

Title: Sherd
Date: -
People: -
Classification: Fragments
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.2815

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2019-03-25

Food 98.8
Bread 98.8
Fossil 89.3
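
The labels above have the shape of output from Amazon Rekognition's label-detection call. Below is a minimal Python sketch, assuming boto3 with configured AWS credentials and a hypothetical local file sherd.jpg; it illustrates the general request/response shape, not the exact pipeline used to produce the 2019 tags.

import boto3

# Minimal sketch: detect labels for a local image with Amazon Rekognition.
# Assumes AWS credentials are configured; "sherd.jpg" is a hypothetical path.
client = boto3.client("rekognition")

with open("sherd.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=80,
    )

for label in response["Labels"]:
    # Confidence is reported as a percentage, matching the scores listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')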

Clarifai
created on 2019-03-25

no person 99.4
food 97
cutout 89.9
one 89.7
paper 89.1
desktop 87.6
stranded 86.8
breakfast 86.6
grow 84.9
sweet 80.9
celebration 80.2
color 79.8
card 79.2
delicious 77.9
meal 77.2
wear 76.7
refreshment 75.3
cooking 74.9
container 74.6
sugar 73.1
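
A comparable hedged sketch for Clarifai's general tagging model, assuming its public v2 REST API; the endpoint path, model identifier, API key, and file name are placeholders, not a record of how these concepts were generated.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed model name

with open("sherd.jpg", "rb") as f:       # hypothetical local path
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values on a 0-1 scale; scale to percent to compare with the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')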

Imagga
created on 2019-03-25

container 63.9
purse 45.3
wallet 40.7
bag 36
envelope 31.3
binding 28.6
case 28
box 25
money 19.6
paper 18.8
object 16.9
food 15.5
currency 15.3
brown 14.7
cash 14.6
finance 14.4
nobody 13.2
gift 12.9
business 12.8
card 11.9
fat 11.2
black 10.2
shopping 10.1
notebook 10.1
color 10
present 10
single 9.9
close 9.7
healthy 9.4
product 9.4
eating 9.3
old 9.1
package 9.1
wealth 9
financial 8.9
celebration 8.8
pay 8.6
snack 8.5
rich 8.4
pink 8.4
texture 8.3
freshness 8.3
banking 8.3
delicious 8.3
meal 8.1
meat 8.1
closeup 8.1
detail 8
decoration 8
design 7.9
cheese 7.6
gourmet 7.6
chocolate 7.6
leather 7.6
dinner 7.6
savings 7.5
dollar 7.4
birthday 7.4
carton 7.4
note 7.4
slice 7.3
diet 7.3
open 7.2
cut 7.2
bank 7.2
colorful 7.2
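
For Imagga, a hedged sketch against its public /v2/tags endpoint; the credentials and image URL are placeholders, and the response fields are assumed from Imagga's API documentation.

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credential
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/sherd.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
for item in resp.json()["result"]["tags"]:
    # Imagga confidences are already on a 0-100 scale, like the list above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')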

Google
created on 2019-03-25

Orange 95.7
Brown 82.5
Rock 64.4
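
The Google labels correspond to Cloud Vision label detection. A minimal sketch with the google-cloud-vision client, assuming application default credentials and the same hypothetical sherd.jpg.

from google.cloud import vision

# Minimal sketch: label detection with the Google Cloud Vision client library.
client = vision.ImageAnnotatorClient()

with open("sherd.jpg", "rb") as f:   # hypothetical local path
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Scores are on a 0-1 scale; scale to percent to compare with the values above.
    print(f"{label.description} {label.score * 100:.1f}")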

Microsoft
created on 2019-03-25

food 34.8
accessory 34.8
orange 5.3
design 3.2
chocolate 2.9

Color Analysis

Feature analysis

Amazon

Bread 98.8%

Categories

Imagga

food drinks 99.9%

Captions

Microsoft
created on 2019-03-25

a piece of cake covered in snow 40.2%
a piece of paper 40.1%
a piece of cake on a table 40%
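
Both the Microsoft tag list and the captions above match the shape of Azure Computer Vision's analyze operation. A hedged sketch against the v3.2 REST API; the resource endpoint, key, and API version are assumptions and may differ from what was used to generate the 2019 data.

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                    # placeholder

with open("sherd.jpg", "rb") as f:   # hypothetical local path
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

result = resp.json()
for tag in result["tags"]:
    # Tag confidences are on a 0-1 scale; scale to percent to compare with the list above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in result["description"]["captions"]:
    # Generated captions with their confidences, as in the Captions section above.
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')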