Human Generated Data

Title

Stamped Amphora Handle

Date

c. 150 CE

People
Classification

Vessels

Machine Generated Data

Tags (confidence, %)

Amazon

Soil 91.8
Archaeology 87.3
Animal 68.9
Turtle 68.9
Reptile 68.9
Sea Life 68.9
Fungus 62.8
Pottery 60.4
Figurine 57.6
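
The Amazon labels above pair a tag with a confidence score, the format returned by Amazon Rekognition's label detection. A minimal sketch of how such pairs could be requested, assuming boto3 with configured AWS credentials and a hypothetical local image file amphora_handle.jpg:

import boto3

# Create a Rekognition client (region and credentials come from the AWS config).
client = boto3.client("rekognition")

# Read the image as raw bytes; the filename here is hypothetical.
with open("amphora_handle.jpg", "rb") as f:
    image_bytes = f.read()

# Request up to 10 labels with at least 50% confidence (assumed thresholds).
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

# Print each label with its confidence, e.g. "Soil 91.8".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')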

Clarifai

no person 99.9
one 97.9
still life 93.5
food 93.2
two 91.8
nature 85.7
mammal 84.7
grow 83.5
winter 78.6
three 77.5
prehistoric 76.2
sculpture 75.8
side view 73.4
travel 71.3
invertebrate 71.3
people 69.2
snow 68.5
cutout 68.1
wildlife 66.6
wood 63.8

Imagga

yam 33.7
food 31.9
invertebrate 29.4
gastropod 27.1
mollusk 23.7
root vegetable 23.1
brown 22.8
sweet potato 21.5
conch 19.6
meal 18.6
vegetable 18.1
arthropod 17.1
meat 16.2
delicious 14.8
close 14.8
dinner 13.5
snack 12.8
insect 12.7
turtle 12.6
lunch 12
healthy 12
gourmet 11.9
produce 11.8
horseshoe crab 11.7
eat 11.7
tasty 11.7
sea turtle 11.4
electric ray 11.2
fresh 11.1
bread 11.1
cuisine 10.6
yellow 10.6
nobody 10.1
fungus 10.1
eating 10.1
nutrition 10.1
diet 9.7
animal 9.5
closeup 9.4
baked 9.4
water 9.3
ray 9
snail 8.9
sea 8.8
beach 8.6
leaf 8.6
pastry 8.5
travel 8.4
ocean 8.4
shovel 8.3
single 8.2
slice 8.2
shell 8
plant 7.9
sweet 7.9
sand 7.8
slow 7.8
life 7.8
fish 7.7
roast 7.7
grilled 7.6
plate 7.6
beef 7.6
sugar 7.5
fat 7.4
dry 7.4
dessert 7
pupa 7

Google

Rock 73.4
Artifact 65.1

Microsoft

eaten 32.5

Feature analysis

Amazon

Turtle 68.9%
Fungus 62.8%

Captions

Microsoft

a piece of food 48.8%
a piece of meat 48.7%
a piece of meat on a white surface 40.8%