Human Generated Data

Title

Stamped Amphora Handle

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Oric Bates, 1977.216.3245

Machine Generated Data

Tags (label and confidence score, %)

Amazon
created on 2019-07-07

Bread 100
Food 100
Pottery 90.9
Soil 76.6
Cushion 72
Archaeology 65
Vase 64.5
Jar 64.5
Plant 60

Clarifai
created on 2019-07-07

no person 99.7
food 97.3
breakfast 92
bread 91.2
delicious 91.1
still life 88.3
grow 86.9
one 86.8
stranded 83.2
cutout 83.1
slice 81.9
chocolate 80
sweet 78.9
baking 77.7
flour 77.7
nutrition 77.4
sugar 75.8
health 74.6
pastry 72.6
cooking 72.4

Imagga
created on 2019-07-07

toast 100
bread 46
food 38.5
brown 33.2
breakfast 30.1
slice 27.3
loaf 27.2
meal 24.4
healthy 24
snack 23.1
wheat 22
bakery 22
sugar 20.5
baked 19.7
diet 19.4
crust 19.4
organic 18.5
nutrition 17.6
close 17.1
tasty 16.7
grain 16.6
eating 16
tile 15.3
eat 15.1
delicious 14.9
sandwich 14.5
whole 14.4
closeup 14.2
baker 13.7
pastry 13.3
piece 13.3
fresh 13.1
natural 12.7
kitchen 12.5
nobody 12.5
lunch 12
gourmet 11.9
candy 11.9
rye 11.8
sweet 11.1
cereal 10.6
bake 10.6
sliced 10.6
dinner 10.1
dessert 9.9
flour 9.8
objects 9.6
product 9.4
fat 9.3
yellow 9.3
freshness 9.2
detail 8.9
object 8.8
paper 8.6
blank 8.6
brick 8.5
health 8.3
texture 8.3
stack 8.3
earthenware 8.3
butter 8.3
life 7.8
baking 7.8
old 7.7
milk 7.6
traditional 7.5
dirty 7.2
cut 7.2

Google
created on 2019-07-07

Rock 83.6
Clay 67.4

Microsoft
created on 2019-07-07

piece 90.9
slice 79.1
chocolate 72.8
bread 61.7
dessert 40.3

Feature Analysis

Amazon

Bread 100%

Categories

Imagga

food drinks 98.5%
