Human Generated Data

Title

Lamp

Date

People

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.175

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Bronze 93.8
Smoke Pipe 91.5
Machine 82.7
Brick 72.6
Rust 57.9
Propeller 57.6
Strap 56.9

Imagga
created on 2022-01-23

ocarina 41.6
wind instrument 41.1
musical instrument 31.9
plug 31.6
pot 24.1
brown 23.5
teapot 22
tool 17.8
close 17.7
wood 17.5
whistle 17.5
old 17.4
metal 16.9
wooden 15.8
hammer 15.6
device 15.5
vessel 14.9
steel 14.1
food 13.4
equipment 13.3
nobody 13.2
container 13
clay 12.7
law 12.6
gavel 12.5
objects 12.2
cup 11.9
object 11.7
court 11.7
wrench 11.3
earthenware 11.1
construction 10.3
utensil 10.2
black 10.2
closeup 10.1
drink 10
cooking utensil 9.7
tools 9.5
work 9.4
tea 9.4
machine 9.4
traditional 9.1
pottery 8.8
spanner 8.8
spice 8.8
nut 8.8
industry 8.5
kitchen 8
auction 7.9
judge 7.9
justice 7.9
culture 7.7
hardware 7.7
repair 7.7
handle 7.6
candy 7.5
iron 7.5
aged 7.2
pulley 7.2
gray 7.2
shiny 7.1

Google
created on 2022-01-23

Wood 82.2
Artifact 79.9
Pottery 77.6
earthenware 75.8
Creative arts 75.1
Gas 70.2
Art 68.3
Metal 66.9
Clay 66
Door 61.6
Auto part 60.4
Circle 57.9
Ceramic 55.4
Sculpture 54.3
Cylinder 51.6
Household hardware 50.8

Microsoft
created on 2022-01-23

indoor 96.2
doughnut 94.1
tan 71.4
chocolate 67.8
plant 46.1

Feature analysis

Amazon

Smoke Pipe 91.5%

Captions

Microsoft

a close up of a doughnut 29.5%
a round brown object on a surface 29.4%
close up of a doughnut 27.2%