Human Generated Data

Title

Knidian Stamped Amphora Handle

Date

2nd century BCE

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.3038

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Bread 95.2
Food 95.2
Cushion 84.4
Tool 72.7
Axe 72.7

Clarifai
created on 2019-07-07

no person 99.5
one 94.9
still life 94.3
texture 87.1
art 87.1
rock 85.8
cutout 83.2
food 83.1
two 82.6
industry 81.6
container 81.5
old 79.4
desktop 79.3
color 76.8
furniture 75.8
hard 75.3
shape 74.9
fabric 74.8
energy 74.6
ancient 73.2

Imagga
created on 2019-07-07

invertebrate 33
food 32
earthenware 27.9
clog 24.4
tile 24.2
snack 23.9
animal 23
brown 22.1
ceramic ware 21
delicious 19.8
covering 19.5
meal 19.5
footwear 19.5
slice 19.1
diet 18.6
bread 17.6
breakfast 16.8
close 16
eat 15.9
object 14.7
organism 14.4
wheat 14.3
fresh 13.7
utensil 13.7
tasty 13.4
bakery 13.4
healthy 12.6
nutrition 12.6
sweet 11.9
holster 11.8
piece 11.8
eating 11.8
nobody 11.7
fat 11.2
lunch 11.1
grain 11.1
candy 11
gourmet 11
toast 10.9
loaf 10.7
dessert 10.6
pastry 10.4
baked 10.3
paper 10.2
sugar 9.6
sheath 9.5
closeup 9.4
yellow 9.3
dinner 9.3
cooking 8.7
objects 8.7
chocolate 8.4
horizontal 8.4
heart 8.3
detail 8
love 7.9
blank 7.7
old 7.7
treat 7.6
traditional 7.5
protective covering 7.3
valentine 7.3
color 7.2

Google
created on 2019-07-07

Artifact 78
Clay 73.7
earthenware 70.1
Ceramic 58.5
Pottery 55.7
Rock 54.2

Microsoft
created on 2019-07-07

half 35.1
eaten 22.4

Feature analysis

Amazon

Bread 95.2%
Axe 72.7%
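
The feature-analysis labels above appear to be a subset of the raw Amazon tags. A minimal sketch of how such a subset could be produced with a simple confidence cutoff (the threshold value and selection logic are assumptions for illustration, not the museum's actual method):

```python
def top_labels(tags, threshold=70.0):
    """Return (label, confidence) pairs at or above the threshold, highest first."""
    return sorted(
        ((label, conf) for label, conf in tags.items() if conf >= threshold),
        key=lambda pair: -pair[1],
    )

# Raw Amazon tags from the record above.
amazon_tags = {"Bread": 95.2, "Food": 95.2, "Cushion": 84.4, "Tool": 72.7, "Axe": 72.7}

print(top_labels(amazon_tags, threshold=90.0))
```

With a 90.0 cutoff only the two strongest labels survive; lowering the threshold to 70.0 would keep all five.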

Captions

Microsoft
created on 2019-07-07

a piece of wood 67.3%