Human Generated Data

Title

Fragmentary Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.2857

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Bronze 80.7
Hole 57.1
Weapon 56.1
Weaponry 56.1

Clarifai
created on 2023-10-27

no person 99.7
pottery 99.6
clay 98.2
one 96.9
still life 96.2
old 96.2
handmade 94.3
art 94.1
antique 93.4
earthenware 92.9
simplicity 92.2
ceramic 92.1
ancient 91.3
nature 91.3
retro 90.6
container 89.4
sculpture 89.3
arts and crafts 88.5
heavy 87.3
traditional 87.1

Imagga
created on 2022-01-23

seal 45.2
fastener 44.4
restraint 32.6
device 28.1
close 19.4
food 16.1
old 15.3
closeup 14.8
brown 14
cup 13.5
breakfast 13.2
gun 12.9
metal 12.9
pot 12.8
sweet 12.6
cannon 12.3
tea 12
black 12
teapot 11.8
health 11.8
object 11.7
drink 11.7
paper 11.6
cotton 11.6
sugar 11.4
tissue 11
vessel 10.6
container 10.5
roll 10.4
color 10
friedcake 9.8
machine 9.5
chocolate 9.4
aroma 9.4
eye 8.9
dessert 8.8
nobody 8.5
hygiene 8.5
eat 8.4
hot 8.4
coffee 8.3
gear 8.3
hole 8.2
healthy 8.2
diet 8.1
light 8
toilet 7.9
cleanse 7.9
fresh 7.8
ceramics 7.8
ancient 7.8
round 7.8
snack 7.7
weapon 7.6
pastry 7.6
cake 7.5
clean 7.5
detail 7.2
beverage 7.2
circle 7

Google
created on 2022-01-23

Artifact 84.4
Art 79.9
Gas 70.4
Metal 69.3
earthenware 67.3
Circle 64.4
Fashion accessory 62.4
Rock 61.2
Creative arts 60.2
Clay 58.5
Pottery 57.8
Ceramic 54.6

Microsoft
created on 2022-01-23

donut 41.9
building material 28.5
stoneware 15.9
stone 13.5

Color Analysis

Categories

Imagga

food drinks 78.7%
paintings art 16.9%
interior objects 2.9%

Captions

Microsoft
created on 2022-01-23

a close up of a doughnut 42%
close up of a doughnut 37.4%
a close up of a donut 35.3%

Text analysis

Amazon

2857