Human Generated Data

Title

Stamped Amphora Handle

Date

160-230 CE

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Dr. A. S. Pease, 1977.216.1910

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Soil 99.3
Archaeology 95.4
Bread 77.7
Food 77.7
Fossil 57.3

Clarifai
created on 2019-07-07

no person 99.6
still life 96.1
one 95.7
food 90.8
prehistoric 90.5
broken 88.3
cutout 87.2
sculpture 84.6
geology 81.9
clay 81.4
two 81
art 77.7
rock 77.5
archaeology 76.6
stone 75.2
cube 75.2
hard 74.8
energy 73
round out 71.2
stranded 69.8

Imagga
created on 2019-07-07

food 37.5
tile 35.6
device 27.8
brake pad 26.6
bread 25.9
restraint 24.5
slice 23.7
brown 23.6
chocolate 22
tasty 21.7
snack 21.4
corbel 20.1
toast 19.4
delicious 19
close 18.8
bakery 18.1
cut 17.9
baked 16.9
sweet 16.6
bracket 16.1
eat 15.9
loaf 15.5
meal 15.4
diet 15.4
breakfast 15
dessert 15
pastry 14.2
healthy 13.9
tool 13.8
sugar 13.8
closeup 13.5
piece 13.3
cake 13.2
fresh 13.1
detail 12.9
nutrition 12.6
old 12.5
kitchen 12.5
support 12.1
wheat 11.4
eating 10.9
candy 10.4
gourmet 10.2
grain 10.2
crust 9.7
metal 9.7
object 9.5
treat 9.5
natural 9.4
nobody 9.3
yellow 9.3
steel 8.8
home 8.8
whole 8.6
construction 8.6
dinner 8.4
iron 8.4
dark 8.4
clog 8
rye 7.8
bake 7.7
rusty 7.6
organic 7.6
wood 7.5
cream 7.5
footwear 7.4
black 7.2

Google
created on 2019-07-07

Artifact 80.3
Rock 75.9
Clay 68.5
earthenware 51

Microsoft
created on 2019-07-07

piece 84.5
half 71.7
slice 70.9
eaten 46.2

Feature analysis

Amazon

Bread 77.7%

Text analysis

Amazon

MEeTestacio

Google

Mte.Testaceio