Human Generated Data

Title

Stamped Amphora Handle

Date

4th-1st century BCE

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.2946

Machine Generated Data

Tags

The number after each tag is that service's confidence score on a 0-100 scale.

Amazon
created on 2019-07-07

Bread 98
Food 98
Rock 96.9
Cushion 85.8

Clarifai
created on 2019-07-07

no person 99.4
food 94.6
still life 92.3
one 87.6
stranded 86
energy 85.9
texture 84.1
industry 81.9
hard 79.7
pile 79.2
cutout 77
broken 76.6
two 75
desktop 74.8
isolated 74.1
rock 70.7
many 69.7
stone 69.7
people 66.2
geology 65.1

Imagga
created on 2019-07-07

whistle 82.5
acoustic device 58.7
signaling device 57.7
device 49.7
plug 40.9
bread 36.1
brown 36.1
food 33.9
tile 32.6
loaf 29.2
meal 27.6
breakfast 23.9
tasty 23.4
slice 22.8
crust 20.3
healthy 20.2
close 20
baked 19.7
snack 18.8
eating 18.5
bakery 18.1
wheat 18.1
organic 16.8
fresh 16.4
diet 16.2
kitchen 15.2
sweet 15
delicious 14.9
closeup 14.2
natural 14.1
nobody 14
cut 13.5
whole 13.4
piece 13.3
grain 12.9
gourmet 12.7
flour 12.7
nutrition 12.6
dessert 12.5
chocolate 12.2
eat 11.7
cereal 11.6
sugar 11.6
ingredient 11.4
candy 11
toast 10.7
bake 10.6
sliced 10.6
pastry 10.4
traditional 10
baker 9.8
bun 9.7
detail 9.7
unhealthy 9.6
objects 9.6
freshness 9.2
rye 8.8
object 8.8
nutritious 8.5
dinner 8.4
wind instrument 8.3
shell 8.2
yellow 8
color 7.8
rubber eraser 7.8
lunch 7.7
milk 7.6
treat 7.6
studio 7.6
dark 7.5
life 7

Google
created on 2019-07-07

Rock 73.4
Artifact 69.2
Clay 53.9

Microsoft
created on 2019-07-07

piece 86.9
slice 71.5
bread 56.4
half 35.8
eaten 22.6
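
The per-service tag lists above can be cross-referenced to see which labels multiple vision services agree on. The sketch below (an assumed ad-hoc structure, not any museum or vendor API; only a few entries per service are copied in for brevity) collects the tags by label and ranks the labels reported by two or more services:

```python
# Minimal sketch: cross-reference the machine-generated tags listed above.
# The data structure and entries are illustrative, truncated from the listing.
from collections import defaultdict

# service -> [(tag, confidence 0-100)], lowercased for comparison
tags = {
    "Amazon":    [("bread", 98.0), ("food", 98.0), ("rock", 96.9)],
    "Clarifai":  [("food", 94.6), ("rock", 70.7), ("stone", 69.7)],
    "Google":    [("rock", 73.4), ("artifact", 69.2), ("clay", 53.9)],
    "Microsoft": [("bread", 56.4), ("slice", 71.5)],
}

# Regroup by label: label -> {service: confidence}
by_label = defaultdict(dict)
for service, entries in tags.items():
    for label, confidence in entries:
        by_label[label][service] = confidence

# Keep labels reported by at least two services; score each by mean confidence.
consensus = {
    label: sum(scores.values()) / len(scores)
    for label, scores in by_label.items()
    if len(scores) >= 2
}
print(sorted(consensus, key=consensus.get, reverse=True))
```

On the truncated data above this ranks "food" first, then "rock", then "bread", which matches the impression from the full listing: the services broadly agree the amphora-handle fragment looks like either food/bread or rock/stone.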

Feature analysis

Amazon

Bread 98%
