Human Generated Data

Title

Fragment from a Vessel

Date

People

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University; deposited by Professor J. M. Paton, 1977.216.2123.453

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Bread 100
Food 100
Fossil 98.5
Rock 94.9
Bronze 81.2
Archaeology 74.8
Mineral 57.7
Soil 57.6
Accessories 55.1
Accessory 55.1

Imagga
created on 2022-01-23

invertebrate 38.2
toast 25.8
animal 24.8
brown 23.5
food 23.2
breastplate 21.2
plate 18.5
close 18.2
armor plate 16.9
stone 16
old 15.3
delicious 14.9
slice 14.5
texture 13.9
meal 13.8
stole 13.6
organism 13.4
yellow 13.2
baked 13.1
bread 12.9
garment 12.8
natural 12.7
nutrition 12.6
eat 11.7
loaf 11.6
closeup 11.4
bakery 11.4
rough 10.9
decoration 10.9
scarf 10.7
mineral 10.7
dessert 10.6
rock 10.4
sweet 10.3
object 10.3
snack 10.2
tasty 10
insect 10
meat 9.9
breakfast 9.7
poncho 9.7
diet 9.7
detail 9.6
chocolate 9.6
clothing 9.5
healthy 9.4
fresh 9.1
black 9.1
tree 9
crust 8.7
ancient 8.6
piece 8.6
sugar 8.4
shield 8.4
pupa 8.3
mollusk 8.1
cut 8.1
light 8
conch 8
antique 7.9
arthropod 7.9
geology 7.8
cloak 7.8
baking 7.7
eating 7.6
art 7.5
decorative 7.5
color 7.2
dirty 7.2
kitchen 7.2
cuisine 7.1
textured 7

Google
created on 2022-01-23

Artifact 82.9
Font 72.1
Natural material 72
Metal 69.2
Stone tool 65.2
Rock 63.6
Bedrock 57.1
Mineral 53.4

Microsoft
created on 2022-01-23

mineral 63.3
rock 20

Color Analysis

Feature analysis

Amazon

Bread 100%

Captions

Microsoft

a close up of a rock 82.6%
close up of a rock 77.5%
a piece of wood 61%