Human Generated Data

Title

Strainer

Date

n.d.

People

-

Classification

Tools and Equipment

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, 1977.216.1988.A-B

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Bronze 100
Mace Club 94.2
Weapon 94.2
Bell 84.4

Clarifai
created on 2018-05-09

no person 98.6
old 95
vintage 93.9
antique 93.8
kitchenware 93.6
one 91
tool 89.8
rusty 88.6
desktop 87.9
retro 86.9
handle 86.1
isolated 85.4
wood 84.7
cutout 82.3
utensil 80.8
invertebrate 79
closeup 78.5
iron 78.3
lid 77.9
steel 77.4

Imagga
created on 2023-10-07

strainer 100
filter 100
metal 30.6
close 24
object 20.5
kitchen 19.7
food 19.3
black 18.6
closeup 17.5
pan 17.2
cooking 16.6
handle 16.2
frying 14.8
equipment 14.8
ladle 13.8
tool 13.6
brown 13.2
steel 13.2
utensil 12.9
breakfast 12.4
single 12.3
spoon 11.8
fry 11.7
drink 11.7
silver 11.5
device 11.4
kitchenware 11.2
container 11.1
hot 10.9
vessel 10.8
healthy 10.7
meal 10.5
empty 10.3
instrument 10.3
cook 10.1
music 9.9
studio 9.9
key 9.8
accessory 9.5
sound 9.4
iron 9.3
eat 9.2
wood 9.2
cup 9
home 8.8
lunch 8.6
flavor 8.6
nobody 8.5
dinner 8.4
heat 8.3
traditional 8.3
metallic 8.3
diet 8.1
detail 8
wooden 7.9
objects 7.8
old 7.7
audio 7.6
tools 7.6
tea 7.5

Google
created on 2018-05-09

Microsoft
created on 2023-10-30

weapon 67.4

Color Analysis

Feature analysis

Amazon

Mace Club 94.2%

Categories

Imagga

nature landscape 51.5%
food drinks 44.5%
pets animals 1.9%

Captions