Human Generated Data

Title

Lamp

Date

People

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.3087.A

Machine Generated Data

Tags (scores are percent confidence)

Amazon
created on 2022-01-23

Fish 96.4
Animal 96.4
Bronze 83.5
Sea Life 63.2
Conch 60
Invertebrate 60
Seashell 60
Apparel 56.6
Clothing 56.6
Shoe 56.6
Clogs 56.6
Footwear 56.6
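
The record does not say exactly how these labels were produced; below is a minimal sketch of the kind of call that yields tag/score pairs like those above, assuming Amazon Rekognition's DetectLabels API. The file name museum_lamp.jpg is a hypothetical placeholder, and AWS credentials are assumed to be configured in the environment.

    import boto3

    # "museum_lamp.jpg" is a placeholder for the object's image file.
    rekognition = boto3.client("rekognition")
    with open("museum_lamp.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the lowest Amazon score in this record is 56.6
        )
    for label in response["Labels"]:
        # Prints e.g. "Fish 96.4", matching the tag/score format above.
        print(f"{label['Name']} {label['Confidence']:.1f}")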

Imagga
created on 2022-01-23

ocarina 91.7
wind instrument 75
footwear 62.9
musical instrument 56.2
shoe 51.2
leather 49
shoes 48.9
pair 45.3
foot 43.8
wear 37.4
boot 35.8
object 35.2
boots 34.1
clothing 34
lace 34
fashion 33.9
shiny 31.6
heels 30.3
classic 29.7
foot gear 29.6
feet 29
men 27.5
rubber 26.9
heel 26.5
male 24.8
close 24.5
brown 24.3
objects 21.7
casual 19.5
clog 18.5
black 17.4
sole 15.7
shell 12.8
two 12.7
orange 12.3
style 11.9
covering 11.7
elegance 10.9
new 10.5
closeup 10.1
old 9.7
variety 9.7
device 9.7
metal 9.6
accessory 9.5
plug 8.8
man 8.7
work 8.6
formal 8.6
modern 8.4
elegant 7.7
tool 7.5
single 7.4
yellow 7.3
detail 7.2
design 7
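
A comparable sketch for the Imagga list, assuming its /v2/tags REST endpoint; the API key, secret, and image URL below are hypothetical, and the response layout (result.tags, with an English tag name and a percent confidence) is an assumption based on Imagga's v2 API.

    import requests

    # Hypothetical credentials and image URL.
    API_KEY, API_SECRET = "your_key", "your_secret"
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/lamp.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    for t in resp.json()["result"]["tags"]:
        # Prints e.g. "ocarina 91.7", matching the list above.
        print(f"{t['tag']['en']} {t['confidence']:.1f}")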

Google
created on 2022-01-23

Musical instrument 91.2
Natural material 81.1
Wood 80.1
Artifact 75.1
Art 71.4
Toy 57.5
Metal 54.4
Fish 51.8
Creative arts 50.3
Conch 50.2
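
For the Google list, a minimal sketch assuming the Cloud Vision label-detection feature; again the image file name is a placeholder. Vision returns scores on a 0-1 scale, so they are multiplied by 100 to match the record's format.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("museum_lamp.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # score is 0-1; scale to percent, e.g. "Musical instrument 91.2"
        print(f"{label.description} {label.score * 100:.1f}")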

Microsoft
created on 2022-01-23

animal 88.9
ceramic 59.5
artifact 56.2
sculpture 50.3
ceramic ware 34.4
stoneware 18.4

Feature analysis

Amazon

Fish 96.4%

Captions

Microsoft

a close up of an animal 67.6%
close up of an animal 61.6%
an animal with its mouth open 57.6%
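
The Microsoft tag list and these captions could come from a single request; a minimal sketch assuming the Azure Computer Vision v3.2 Analyze endpoint with the Tags and Description features. The endpoint, key, and image file below are hypothetical placeholders, and confidences are scaled from 0-1 to percent.

    import requests

    endpoint = "https://<your-resource>.cognitiveservices.azure.com"
    with open("museum_lamp.jpg", "rb") as f:
        resp = requests.post(
            f"{endpoint}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags,Description"},
            headers={
                "Ocp-Apim-Subscription-Key": "<your-key>",
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    analysis = resp.json()
    for tag in analysis["tags"]:
        # e.g. "animal 88.9"
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
    for caption in analysis["description"]["captions"]:
        # e.g. "a close up of an animal 67.6"
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}")

Requesting both features in one call would also explain why the tags and captions share the same creation date.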