Human Generated Data

Title

Lamp

Date

People

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Gift of Pfeiffer-Hartwell Collection, 1977.216.359

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Snake 99.2
Animal 99.2
Reptile 99.2
Smoke Pipe 94.8

Imagga
created on 2022-01-23

ocarina 67.1
cannon 58.9
wind instrument 58.5
gun 48.6
musical instrument 43.7
weapon 35.6
whistle 25.9
device 24.4
metal 23.3
close 18.2
old 17.4
signaling device 17.4
acoustic device 17.2
brown 16.2
cup 13.5
steel 13.2
pot 13
drink 12.5
vintage 12.4
closeup 12.1
food 12.1
tool 11.8
antique 11.3
instrument 10.7
plug 10.3
coffee 10.2
metallic 10.1
wood 10
traditional 10
breakfast 9.7
nut 9.7
spice 9.6
beverage 9
health 9
equipment 9
kitchen 8.9
ancient 8.6
rusty 8.6
aroma 8.4
black 8.4
object 8.1
natural 8
light 8
wooden 7.9
sweet 7.9
nobody 7.8
caffeine 7.7
hardware 7.7
break 7.6
roll 7.6
hot 7.5
container 7.4
bark 7.3
aged 7.2
gray 7.2
history 7.1
dessert 7.1

Google
created on 2022-01-23

Pottery 80.9
Artifact 80
earthenware 78.3
Art 75.9
Wood 71.7
Ceramic 69.8
Metal 61.9
Vase 61.6
Creative arts 60.2
Circle 55.8
Fashion accessory 54.1
Terrestrial animal 53.8
Clay 52.7
Rock 51.1

Microsoft
created on 2022-01-23

indoor 85.7
ceramic 82.3
tool 31.7

Feature analysis

Amazon

Snake 99.2%

Captions

Microsoft

a close up of a statue 61.3%
close up of a statue 56.4%