Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Oric Bates, 1977.216.3215

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Bread 84.2%
Food 84.2%
Figurine 77.4%
Bronze 72.1%
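
These labels are the kind of output produced by Amazon Rekognition's label-detection API. A minimal sketch in Python using boto3, not the museum's actual pipeline: it assumes configured AWS credentials and a hypothetical local image file "lamp.jpg".

    import boto3

    # Minimal sketch: detect labels for a local image with Amazon Rekognition.
    # Assumes AWS credentials are configured; "lamp.jpg" is a hypothetical filename.
    client = boto3.client("rekognition")
    with open("lamp.jpg", "rb") as image:
        response = client.detect_labels(Image={"Bytes": image.read()}, MinConfidence=70)
    for label in response["Labels"]:
        # Rekognition reports confidence on a 0-100 scale.
        print(f"{label['Name']} {label['Confidence']:.1f}%")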

Clarifai
created on 2023-10-26

pottery 99.8%
art 99.3%
no person 99%
old 98.6%
clay 98.2%
ancient 98.1%
retro 98.1%
ceramic 97.2%
container 96.9%
antique 96.6%
one 95.8%
cutout 94.9%
handmade 94.6%
traditional 94.1%
decoration 93.1%
earthenware 88.5%
sculpture 87.3%
bird 87.2%
vintage 86.9%
museum 85.6%
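
These concepts resemble predictions from Clarifai's v2 predict API. A hedged sketch over REST, where the model name "general-image-recognition", the API key, and the image URL are all placeholder assumptions:

    import requests

    # Hedged sketch of Clarifai's v2 predict REST API; the model name,
    # key, and image URL below are assumptions, not documented values.
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    headers = {"Authorization": "Key YOUR_CLARIFAI_KEY"}
    payload = {"inputs": [{"data": {"image": {"url": "https://example.com/lamp.jpg"}}}]}
    response = requests.post(url, json=payload, headers=headers)
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai scores concepts in [0, 1]; scale to match the list above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}%")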

Imagga
created on 2022-01-23

ocarina 100%
wind instrument 100%
musical instrument 100%
piggy 26.9%
bank 25.1%
money 20.4%
pig 20.2%
piggy bank 19.7%
finance 17.7%
save 17.1%
savings 16.8%
animal 16.5%
investment 15.6%
close 15.4%
banking 14.7%
coin 14.3%
saving 13.5%
currency 13.4%
cash 12.8%
wealth 12.6%
brown 12.5%
object 11.7%
dollar 11.1%
economy 11.1%
business 10.9%
financial 10.7%
decoration 10.1%
symbol 10.1%
container 9.9%
rabbit 9.7%
ceramic 9.7%
pink 9.2%
traditional 9.1%
gold 9%
color 8.9%
retirement 8.6%
old 8.4%
toy 8.3%
mammal 8.1%
cute 7.9%
food 7.8%
bunny 7.8%
clay 7.8%
account 7.7%
invest 7.7%
coins 7.7%
debt 7.7%
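
These tags match the shape of Imagga's v2 tagging endpoint, which returns English tag names with percentage confidences. A sketch assuming placeholder credentials and a placeholder image URL:

    import requests

    # Hedged sketch of Imagga's v2 tags endpoint; the credentials and
    # image URL are placeholders.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/lamp.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    for tag in response.json()["result"]["tags"]:
        # Imagga reports confidence on a 0-100 scale.
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}%")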

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

Feature analysis

Amazon

Bread 84.2%

Categories

Imagga

paintings art 97%
food drinks 1.8%
pets animals 1.2%

Captions

Microsoft
created on 2022-01-23

a close up of a stone wall 49.2%
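
A caption of this form, with a 0-1 confidence shown here as a percentage, is what Azure Computer Vision's describe operation returns. A sketch assuming a placeholder endpoint, key, and image URL, and an assumed API version of v3.2:

    import requests

    # Hedged sketch of Azure Computer Vision's describe operation; the
    # endpoint, key, image URL, and API version v3.2 are assumptions.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}
    body = {"url": "https://example.com/lamp.jpg"}
    response = requests.post(f"{endpoint}/vision/v3.2/describe", headers=headers, json=body)
    for caption in response.json()["description"]["captions"]:
        # Confidence is reported in [0, 1]; scale to a percentage as above.
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")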