Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.172

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2022-01-23

Bread 94.7
Food 94.7
Pottery 87.7
Archaeology 66
Pot 64.6
Jug 63.2
Teapot 55.1
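
The labels above come from Amazon's image-labeling service. As a rough illustration of how such label/confidence pairs are typically produced (a minimal sketch only, not the museum's actual pipeline; the filename lamp.jpg and the confidence threshold are assumptions):

    # Sketch: label detection with Amazon Rekognition via boto3.
    # Assumes AWS credentials are configured and a photo of the object
    # is available locally as "lamp.jpg" (hypothetical filename).
    import boto3

    rekognition = boto3.client("rekognition")

    with open("lamp.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed cutoff; the list above bottoms out around 55
        )

    # Print each label with its confidence score, as in the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')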

Clarifai
created on 2023-10-26

pottery 99.8
no person 98.7
clay 98.4
art 98.3
old 96.4
handmade 96.2
antique 95.7
still life 95.7
ancient 94
cutout 93.8
sculpture 93.6
earthenware 93.1
nature 92.6
one 92.4
ceramic 92.3
retro 89.7
simplicity 88.9
container 87.8
arts and crafts 87
traditional 85.6

Imagga
created on 2022-01-23

teapot 100
pot 95.7
vessel 69.7
cooking utensil 68.7
container 48
snail 46.9
kitchen utensil 45.7
mollusk 42.9
gastropod 40.4
shell 34
slow 28.3
animal 25.6
utensil 24.9
invertebrate 24.5
brown 23.5
close 21.1
garden 17.6
food 15.7
spiral 15.2
slime 14.8
closeup 14.8
slug 14.8
speed 13.7
fungus 12.9
slimy 12.8
clay 11.7
traditional 10.8
tea 10.5
old 10.4
fruit 9.5
decoration 9.4
china 9.3
eat 9.2
black 9
healthy 8.8
shiny 8.7
natural 8.7
gold 8.2
retro 8.2
pottery 8.2
acorn 8
crawling 7.9
nobody 7.8
eating 7.6
organic 7.6
wood 7.5
art 7.3
organism 7.3
tree 7.1
leaf 7

Google
created on 2022-01-23

Dishware 87.4
Serveware 86.4
Sculpture 85.4
Artifact 83.3
Pottery 83.2
Tableware 82.8
Art 82.1
earthenware 78.6
Wood 76.9
Teapot 75.1
Snout 74.7
Ceramic 70.3
Toy 65.4
Carving 65.2
Creative arts 64.1
Clay 62.5
Ancient history 61.5
Snail 60.3
History 59.6
Rock 59.2
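
Label/score pairs like the Google list above can be produced with the Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up; the filename is again a hypothetical stand-in for the object photo:

    # Sketch: label detection with the Google Cloud Vision API.
    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set and the
    # google-cloud-vision package is installed.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("lamp.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Vision reports scores in [0, 1]; scale to percentages to match the list above.
    for annotation in response.label_annotations:
        print(f"{annotation.description} {annotation.score * 100:.1f}")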

Color Analysis

Feature analysis

Amazon

Bread 94.7%

Categories

Captions

Microsoft
created on 2022-01-23

a close up of an animal 62.7%
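
The caption above comes from Microsoft's image description service. A comparable caption could be obtained with the Azure Computer Vision SDK; this is a sketch only, and the endpoint, key, and filename are placeholders:

    # Sketch: image captioning with Azure Computer Vision.
    # Assumes the azure-cognitiveservices-vision-computervision package is installed.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    with open("lamp.jpg", "rb") as f:
        description = client.describe_image_in_stream(f)

    # Each caption carries a confidence in [0, 1]; scale to match the value above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")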