Human Generated Data

Title

Lamp

Date

-

People

-

Classification

Lighting Devices

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of W.C. Burriss Young, 2002.60.7

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Wood 100%
Bread 95.7%
Food 95.7%
Plywood 83.4%
Driftwood 69.7%
Figurine 67.5%
Archaeology 62.7%
Fish 60.8%
Animal 60.8%
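
The machine tags in this section are confidence-scored labels from commercial image-recognition services. As a minimal sketch, assuming the Amazon tags were produced by Amazon Rekognition's DetectLabels operation (the file name lamp.jpg and the 60% threshold are illustrative assumptions, not details taken from this record), label-confidence pairs like the ones above could be retrieved as follows:

    import boto3

    # Assumed setup: AWS credentials configured locally and a photograph of the object on disk.
    rekognition = boto3.client("rekognition")

    with open("lamp.jpg", "rb") as image_file:  # hypothetical file name
        response = rekognition.detect_labels(
            Image={"Bytes": image_file.read()},
            MinConfidence=60,  # roughly matches the lowest confidences listed above
        )

    # Print each label with its confidence, mirroring the "Label NN.N%" layout of this record.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")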

Clarifai
created on 2019-07-06

no person 98.9%
one 94%
food 92.4%
still life 92.2%
nature 90.2%
stone 82.3%
desktop 76.7%
rock 75.7%
winter 75.3%
sculpture 74.8%
wood 74.6%
travel 73.6%
simplicity 72.3%
little 70.3%
medicine 69.9%
snow 69.1%
old 68.3%
grow 67.8%
cutout 65.7%
isolated 65.6%

Imagga
created on 2019-07-06

ocarina 100%
wind instrument 87.3%
musical instrument 65.4%
food 33.8%
plug 32.3%
brown 24.3%
snack 22.2%
sweet 19.7%
tasty 16.7%
chocolate 15.9%
eat 15.9%
cookies 15.6%
cookie 14.6%
bakery 14.3%
dessert 14.1%
nobody 14%
yellow 13.9%
healthy 13.8%
eating 13.4%
natural 13.4%
delicious 13.2%
close 13.1%
baked 13.1%
meal 12.2%
diet 12.1%
bread 12%
gourmet 11.9%
biscuit 11.7%
treat 11.4%
pastry 11.4%
tee 11.3%
object 11%
nutrition 10.9%
closeup 10.8%
baking 10.6%
sugar 10.3%
animal 10.3%
dry 10.2%
organic 10.1%
chip 9.9%
bake 9.6%
lunch 9.4%
stack 9.2%
golf equipment 9.1%
studio 9.1%
peg 9.1%
old 9%
breakfast 8.8%
equipment 8.8%
fresh 8.5%
two 8.5%
color 8.3%
health 8.3%
ingredient 7.9%
cooking 7.9%
clog 7.6%
horizontal 7.5%
cook 7.3%
group 7.2%
detail 7.2%
fastener 7.1%
wooden 7%

Google
created on 2019-07-06

Artifact 70.3%
Jaw 69.6%
Beige 66.6%
Wood 59.6%
Tooth 54.8%
Rock 54.2%
Fossil 51.9%

Color Analysis

Feature Analysis

Amazon

Bread 95.7%
Fish 60.8%

Categories

Imagga

pets animals 99.5%

Captions

Microsoft
created on 2019-07-06

a piece of wood 60.3%
a close up of a knife 31.1%
a piece of paper 31%