Human Generated Data

Title

Lead Token

Date

1 - 200 CE

People

-

Classification

Tokens

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Alice Corinne McDaniel Collection, Department of the Classics, Harvard University, 2008.116.32

Machine Generated Data

Tags

Amazon
created on 2022-05-07

Accessories 99
Accessory 99
Gemstone 98.9
Jewelry 98.9
Ornament 96.6
Agate 93.9

Clarifai
created on 2023-10-17

iron 99.4
no person 99.4
metalwork 98.4
stone 98.4
sculpture 98
rusty 97.5
one 96.5
desktop 96.5
hard 96.5
ancient 96.4
prehistoric 96.3
rust 95.6
old 95.3
disjunct 95.2
geology 95.2
side view 95.1
art 94.6
strong 93.9
rock 93.9
precious 93.7

Imagga
created on 2022-05-07

fastener 75.2
restraint 56.3
nut and bolt 55.4
screw 32.9
device 27.6
food 23
close 21.1
meal 19.5
brown 16.9
delicious 16.5
black 15.6
gourmet 15.3
object 14.6
snack 14.5
bakery 14.3
baked 14
dark 13.4
bread 13
stone 12.6
eating 12.6
mineral 12.6
healthy 12.6
tasty 12.5
sweet 11.8
slice 11.8
loaf 11.6
breakfast 11.5
diet 11.3
chocolate 11.3
dessert 11.2
fresh 11.1
nutrition 10.9
closeup 10.8
lunch 10.3
dinner 10.1
cut 9.9
kitchen 9.8
candy 9.7
pastry 9.5
meat 9
color 8.9
beef 8.8
crust 8.7
natural 8.7
nobody 8.5
piece 8.5
cake 8.5
iron 8.4
fat 8.4
eat 8.4
detail 8
cuisine 8
objects 7.8
bake 7.7
roast 7.7
whole 7.6
organic 7.6
gold 7.4
tool 7.4
gem 7

Categories

Imagga

paintings art 57.4%
food drinks 40.6%

Captions

Microsoft
created on 2022-05-07

a piece of paper 32.7%
a piece of wood 32.6%

Text analysis

Amazon

cm

Google

cm