Human Generated Data

Title

Stamped Amphora Handle

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Oric Bates, 1977.216.3249

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Food 99.9
Bread 99.9
Plant 79.2
Peel 59.9
Figurine 55.4
Sweets 55.4
Confectionery 55.4

Clarifai
created on 2019-07-07

no person 99.7
food 98.7
still life 97.2
grow 93.6
one 92.6
cutout 92.5
sweet 89.1
two 88.7
candy 87.5
desktop 87.3
pile 86
texture 85.3
three 85.3
several 83.6
group 83.5
color 82.4
stranded 81.6
confection 80.6
delicious 80.5
container 80.1

Imagga
created on 2019-07-07

tamarind 81
edible fruit 66.7
fruit 43
produce 31.7
food 30.1
tool 22.6
tile 18.1
scraper 17.9
metal 17.7
hand tool 16.6
close 14.3
closeup 14.1
detail 12.9
snack 12.8
old 12.5
brown 12.5
steel 12.4
flatworm 12.4
animal 12.3
texture 10.4
tasty 10
worm 9.9
object 9.5
sweet 9.5
construction 9.4
invertebrate 9.2
eat 9.2
repair 8.6
treat 8.6
fresh 8.5
baked 8.4
hand 8.3
meal 8.1
yellow 7.9
dessert 7.9
wrench 7.9
industry 7.7
equipment 7.6
rusty 7.6
wood 7.5
device 7.5
delicious 7.4
bread 7.4
slice 7.3
diet 7.3
bicycle seat 7.1
work 7.1

Google
created on 2019-07-07

Clay 61

Microsoft
created on 2019-07-07

piece 89.8
slice 79.8
half 47.8
bread 42.7
eaten 35.3

Feature analysis

Amazon

Bread 99.9%

Categories

Imagga

food drinks 99.2%

Captions

Microsoft
created on 2019-07-07

a piece of bread 80.8%
a piece of food 77.1%
a slice of bread 70.6%