Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Rt. Rev. Robert M. Hatch, 2000.187

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Plant 96.8
Gourd 94.1
Vegetable 94.1
Food 94.1
Produce 94.1
Egg 89.6
Archaeology 79.5
Soil 66.2
Hole 60.8
Grain 58.5
Seed 58.5
Bronze 55.3

Clarifai
created on 2023-10-25

no person 99.8
pottery 98
one 96.6
sculpture 96.5
nature 96.5
clay 95.7
round out 95
art 94.6
ancient 94.3
simplicity 93.7
retro 91.9
arts and crafts 91.8
heavy 90.3
prehistoric 88.8
old 88.5
iron 88.4
wood 87.8
handmade 86.4
rusty 85.6
still life 84.4

Imagga
created on 2022-01-14

ocarina 100
wind instrument 100
musical instrument 93.5
food 41.9
bread 35.2
brown 33.1
loaf 29.1
healthy 26.4
breakfast 24.7
baked 23.4
snack 23.1
meal 22.7
bakery 21.9
organic 21.8
crust 21.3
fresh 19.6
close 19.4
eating 19.3
nutrition 19.3
wheat 18.1
tasty 17.5
grain 17.5
natural 17.4
yam 17.1
delicious 16.5
cereal 16.4
whole 16.2
pastry 16.1
diet 14.5
studio 14.4
dinner 13.5
slice 12.7
sweet 12.6
eat 12.6
closeup 12.1
lunch 12
vegetable 11.9
gourmet 11.9
object 11.7
kitchen 11.6
stone 11
crusty 10.8
carbohydrates 10.8
rye 10.8
flour 10.7
bake 10.6
health 10.4
seed 10.3
freshness 10
root vegetable 9.9
cut 9.9
bun 9.7
ingredient 9.7
homemade 9.6
black 9.6
rock 9.6
nutritious 9.5
nobody 9.3
pebble 8.8
sweet potato 8.8
cooking 8.7
fruit 8.7
cake 8.5
fat 8.4
stack 8.3
meat 8.1
spa 8.1
round 7.8
therapy 7.5
gray 7.2
dessert 7.1
produce 7

Google
created on 2022-01-14

Artifact 82.8
Wood 73.1
Metal 65
Art 64
Rock 61.1
Circle 59.4

Color Analysis

Feature analysis

Amazon

Egg 89.6%

Categories

Imagga

food drinks 93.5%
pets animals 5.7%

Captions

Microsoft
created on 2022-01-14

a close up of a doughnut 27.2%
a close up of a mans face 27.1%