Human Generated Data

Title

Lamp

Date

People
Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Museum Collection, 1951.107.2

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Reptile 97.4
Snake 97.4
Animal 97.4
Pottery 92.1
Bronze 91.2
Home Decor 61.1
Jar 55.8
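Each machine-generated tag above is paired with a confidence score on a 0–100 scale. As a hypothetical illustration only (the tag names and scores are copied from the Amazon list in this record, but the filtering function, its name, and the 90-point threshold are assumptions, not part of the museum's pipeline), a high-confidence subset could be selected like this:

```python
# Tag names and confidence scores copied from the Amazon list in this record.
amazon_tags = {
    "Reptile": 97.4,
    "Snake": 97.4,
    "Animal": 97.4,
    "Pottery": 92.1,
    "Bronze": 91.2,
    "Home Decor": 61.1,
    "Jar": 55.8,
}

def high_confidence(tags, threshold):
    """Return tag names scoring at or above `threshold`,
    ordered by descending confidence (hypothetical helper)."""
    keep = [(name, score) for name, score in tags.items() if score >= threshold]
    keep.sort(key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in keep]

print(high_confidence(amazon_tags, 90.0))
```

With a 90-point cutoff, only the reptile- and vessel-related labels survive, which matches the pattern of the lower-scored decor tags being less reliable.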

Imagga
created on 2022-01-23

cannon 61.1
gun 46.6
weapon 33.9
old 26.5
metal 24.9
ocarina 21.2
vessel 20.5
mortar 19.8
device 18.6
brown 18.4
tool 18.4
rusty 18.1
container 18
wind instrument 18
steel 16.8
wood 15
musical instrument 13.5
drink 13.4
vintage 13.2
object 13.2
close 12.5
pot 12.5
nobody 12.4
antique 12
cup 11.7
wooden 11.4
iron 11.2
coffee 11.1
bark 11
food 10.9
closeup 10.8
breakfast 10.6
rust 10.6
equipment 10.5
fastener 10.2
construction 9.4
traditional 9.1
dirty 9
vase 8.7
ancient 8.6
work 8.6
wrench 8.6
culture 8.5
jug 8.5
aroma 8.4
black 8.4
artillery 8.4
hammer 8.3
spice 8.1
ceramic 7.7
restraint 7.7
health 7.6
handle 7.6
break 7.6
dry 7.4
metallic 7.4
natural 7.4
detail 7.2
aged 7.2
beverage 7.2
kitchen 7.1
mechanical device 7.1
instrument 7

Google
created on 2022-01-23

Wood 83
Artifact 80.7
Art 76.5
Earthenware 66.9
Metal 61.3
Sculpture 59.5
Rock 56.7
Circle 55.4
Clay 54.8
Font 53
Brick 50.3

Microsoft
created on 2022-01-23

stone 4.7

Feature analysis

Amazon

Snake 97.4%

Captions

Microsoft

a hole in the ground 70%
a close up of a hole in the ground 69.9%
close up of a hole in the ground 68.1%