Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Oric Bates, 1977.216.3236

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Wood 94.8
Pottery 86.9
Archaeology 71.8
Hole 67.2
Pot 59.7
Teapot 58.2
Jug 56.3

Clarifai
created on 2023-10-26

ancient 97.2
no person 97.1
art 96.3
round out 95.8
retro 95.2
old 94.9
antique 94.8
round 94.4
sculpture 94.1
pottery 93.1
desktop 90.6
container 89.6
handmade 89.5
cutout 89.3
wood 88.8
decoration 88
nature 87.7
one 86.9
isolate 86.3
isolated 85

Imagga
created on 2022-01-23

ocarina 78.1
wind instrument 62.6
musical instrument 46.9
piggy bank 22.8
mask 22.8
container 20
close 18.8
money 17
savings bank 16.1
old 16
covering 15.9
brown 14.7
ancient 13.8
disguise 13.4
bank 12.5
pottery 11
cash 11
traditional 10.8
pot 10.6
coin 10.5
metal 10.5
shell 10.4
object 10.3
art 10.2
closeup 10.1
animal 10.1
vintage 9.9
wealth 9.9
currency 9.9
finance 9.3
attire 9.3
head 9.2
antique 9.2
banking 9.2
wood 9.2
gold 9
eye 8.9
clay 8.8
piggy 8.7
carving 8.6
culture 8.5
save 8.5
savings 8.4
economy 8.3
earthenware 8.2
history 8
financial 8
decoration 8
teapot 7.9
black 7.8
face 7.8
handmade 7.7
golden 7.7
investment 7.3
vessel 7.3
sculpture 7.1
wooden 7

Google
created on 2022-01-23

Natural material 87.3
Wood 82.3
Artifact 80.6
Trunk 74.7
Circle 66
Rectangle 63.4
Fossil 57.8
Metal 57.1
Tree 55.5
Rock 51.7

Microsoft
created on 2022-01-23

wall 98.1
indoor 96.8
music 81.1
stone 26.9
stoneware 18.8

Color Analysis

Categories

Imagga

pets animals 85.7%
paintings art 14.2%

Captions

Microsoft
created on 2022-01-23

a close up of a stuffed animal 34.8%