Human Generated Data

Title

VASE IN FORM OF SHOE. (BLACK TERRA COTTA)

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Mr. and Mrs. Lyman W. Gale, 1935.43

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Smoke Pipe 94.3
Bronze 87.5
Plant 78
Clothing 65
Apparel 65
Pottery 64.2
Art 56.4
Sculpture 55.4

Clarifai
created on 2023-10-29

monochrome 99.7
no person 99.1
one 99.1
still life 99
people 98.8
two 98.1
black and white 95.5
mono 94.3
fashion 93.8
shining 93.5
nature 92.8
woman 92.4
food 92.4
cutout 91.5
wear 90.8
art 90.4
leather 88.4
hot 87.6
sculpture 87.2
contrast 86.3

Imagga
created on 2022-06-11

footwear 96.1
clog 70.4
shoe 50
covering 46.9
shoes 36.5
leather 36.1
pair 33
foot 31.4
ocarina 28.1
fashion 27.9
clothing 27.8
lace 25.6
boots 25.4
black 25.2
shiny 24.5
wear 23
boot 22.9
wind instrument 22.6
stone 21.9
rubber 19.2
object 19.1
heels 18.6
classic 18.6
male 18.4
brown 17.7
men 17.2
musical instrument 16.9
heel 16.7
rock 16.5
feet 16.4
spa 16.1
loafer 16
pebble 15.8
two 15.2
arctic 14.8
foot gear 14.8
sole 13.8
casual 13.6
device 13.3
balance 13.3
close 12.6
stones 12.3
therapy 12.2
harmony 12.2
style 11.9
health 11.8
walking 11.4
laces 10.8
accessory 10.5
objects 10.4
elegance 10.1
new 9.7
meditation 9.6
rocks 9.4
relax 9.3
relaxation 9.2
wellness 9.1
food 9.1
old 9.1
reflection 8.9
used 8.6
stack 8.3
healthy 8.2
man 8.1
natural 8
ingredient 7.9
business 7.9
luxury 7.7
culture 7.7
formal 7.6
shoehorn 7.6
modern 7

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

Feature analysis

Amazon

Smoke Pipe 94.3%

Captions

Microsoft
created on 2022-06-11

a close up of a knife 37.6%
a bowl of fruit 27.8%
a close up of a fruit 27.7%