Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Oric Bates, 1977.216.3227

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Bronze 99.8
Pottery 80.5

Clarifai
created on 2023-10-26

sculpture 99.6
pottery 99.2
ancient 99.1
art 98.9
no person 98.9
clay 98.1
old 97.5
retro 96.9
one 96.3
armor 96.1
antique 95.8
museum 95.2
copper 95.1
handmade 94.4
veil 93.9
wear 93
cutout 91.7
metalwork 90.8
container 89.7
bronze 89.7

Imagga
created on 2022-01-23

ocarina 100
wind instrument 100
musical instrument 96.2
earthenware 22.5
old 22.3
brown 21.3
money 21.3
object 18.3
bank 17.9
metal 17.7
currency 17
finance 16.9
wood 16.7
cash 16.5
ceramic ware 15.7
food 15.1
wooden 14.9
antique 14.7
coin 14.3
close 14.3
vintage 14.1
retro 13.9
ancient 13.8
coins 13.5
decoration 13.1
closeup 12.8
banking 11.9
business 11.5
handmade 10.7
savings 10.3
utensil 10.2
traditional 10
wealth 9.9
save 9.5
economy 9.3
gold 9
financial 8.9
container 8.6
pay 8.6
round 8.6
snack 8.5
historical 8.5
rich 8.4
investment 8.2
fruit 8.2
golden 7.7
treasure 7.7
piggy 7.7
culture 7.7
decorated 7.7
decorative 7.5
art 7.4
symbol 7.4
fresh 7.2
history 7.2
shiny 7.1
sweet 7.1
animal 7.1

Google
created on 2022-01-23

Creative arts 83.5
Artifact 80.4
Art 79.3
Natural material 79.2
Wood 75.7
Metal 56.4
Carving 56.1
Rock 53.7

Microsoft
created on 2022-01-23

wall 98.8
indoor 85.8
bronze 76.6
plant 47.1
stoneware 17.6

Color Analysis

Categories

Imagga

paintings art 98.9%

Captions

Microsoft
created on 2022-01-23

a close up of a bowl 43%