Human Generated Data

Title

Utensil

Date

-

People

-

Classification

Tools and Equipment

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.229

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Knife 97.2
Weaponry 97.2
Blade 97.2
Weapon 97.2
Hacksaw 74.8
Tool 74.8
Handsaw 74.8
Strap 63.5
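
The labels above are Amazon Rekognition label-detection results, reported as confidence percentages. A minimal sketch of how comparable tags can be requested with boto3; the image file name, label limit, and confidence threshold are placeholders, not the settings used for this record:

    # Sketch: image labels from Amazon Rekognition via boto3 (placeholder file and thresholds).
    import boto3

    client = boto3.client("rekognition")

    with open("utensil.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,
            MinConfidence=60,
        )

    for label in response["Labels"]:
        # Confidence is already a percentage, e.g. "Knife 97.2"
        print(f"{label['Name']} {label['Confidence']:.1f}")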

Clarifai
created on 2019-07-07

no person 97.4
fashion 95.5
leather 93.4
wear 93.2
one 91.7
isolated 89.6
retro 89.3
sharp 85.8
desktop 84.5
steel 82.9
plastic 80.1
art 77.7
knife 76.6
vintage 75.5
paper 75.3
old 75.0
closeup 72.2
glazed 71.5
classic 69.1
industry 68.8
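
The Clarifai tags are concept predictions from its general image-recognition model, scored from 0 to 1 and shown above as percentages. A rough sketch against the v2 REST API; the API key, image URL, and model identifier are placeholders and may differ from what was used when this record was generated in 2019:

    # Sketch: concept tags from the Clarifai v2 REST API (placeholder key, URL, and model id).
    import requests

    headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
    payload = {"inputs": [{"data": {"image": {"url": "https://example.org/utensil.jpg"}}}]}

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Scores are 0-1; scale to match the percentages listed above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")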

Imagga
created on 2019-07-07

scabbard 100
sheath 90.2
protective covering 66.9
blade 45.9
covering 44.1
knife 44
tool 41.8
knife blade 34.4
dagger 31.5
object 27.9
weapon 26.7
metal 23.3
cutting implement 22.9
steel 22.1
equipment 19.6
cut 16.1
nobody 15.5
handle 15.3
stainless 14.5
sharp 14.3
single 14
close 13.7
cutting 12.6
wood 12.5
silver 12.4
kitchen 11.6
work 11
closeup 10.8
utensil 10.7
shiny 10.3
food 10.3
iron 10.3
horizontal 10
fresh 9.8
old 9.8
detail 9.7
cooking 9.6
paper 9.4
metallic 9.2
danger 9.1
brown 8.8
wooden 8.8
preparation 8.6
business 8.5
cook 8.2
chef 7.8
space 7.8
construction 7.7
saw 7.6
tools 7.6
letter opener 7.6
diet 7.3
black 7.2
currency 7.2
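
Imagga's tags come from its /v2/tags endpoint, which returns a confidence percentage per tag. A sketch of the request, assuming placeholder API credentials and a hosted image URL:

    # Sketch: tags from the Imagga /v2/tags endpoint (placeholder credentials and URL).
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/utensil.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        # Confidence is already a percentage, e.g. "scabbard 100"
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")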

Google
created on 2019-07-07

Machete 56.4
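
Google's single label is a Cloud Vision label-detection result. A sketch with the Python client library, assuming credentials are configured in the environment; the file name is a placeholder:

    # Sketch: label detection with the Google Cloud Vision client library (placeholder file).
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("utensil.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    for label in response.label_annotations:
        # Scores are 0-1; scale to match the percentage shown above.
        print(f"{label.description} {label.score * 100:.1f}")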

Microsoft
created on 2019-07-07

case 37.9
accessory 35.5

Feature analysis

Amazon

Knife 97.2%

Captions

Microsoft
created on 2019-07-07

a close up of a knife 60%
a knife on a cutting board 28.3%