Human Generated Data

Title

Vase

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Misses Norton, 1920.44.5

Machine Generated Data

Tags

Amazon
created on 2022-01-14

Jar 98.8
Pottery 98.3
Vase 95.7
Urn 94.4
Milk 93.8
Beverage 93.8
Drink 93.8

Clarifai
created on 2023-10-26

no person 99.7
pottery 99.1
art 97.7
marble 97.3
one 96
handmade 95.5
ancient 95.3
clay 95.2
antique 95.1
arts and crafts 94.4
retro 93.9
artistic 92.3
empty 90.3
sculpture 89.2
nature 89
old 88.7
simplicity 87.3
traditional 85.1
dirty 84.8
interior design 81.7

Imagga
created on 2022-01-14

vase 100
jar 100
vessel 82.3
container 65.2
food 25.1
bottle 24.1
ceramic ware 23.3
earthenware 21.4
utensil 18.4
jug 17.6
porcelain 16.7
glass 16.3
brown 16.2
object 16.1
china 15.8
pottery 15.3
fresh 15
traditional 15
decoration 14.5
healthy 13.8
closeup 12.8
pot 12.5
antique 11.9
decorative 11.7
close 11.4
natural 11.4
fruit 11.1
drink 10.8
ingredient 10.5
craft 10.5
ancient 10.4
art 10.2
nobody 10.1
pitcher 9.8
clay 9.8
old 9.7
health 9.7
liquid 9.5
culture 9.4
freshness 9.1
ornament 8.6
yellow 8.6
organic 8.4
clean 8.3
vintage 8.3
meal 8.1
diet 8.1
history 8
kitchen 8
medicine 7.9
terracotta 7.9
ceramics 7.8
dinner 7.6
delicious 7.4
single 7.4
raw 7.1
sweet 7.1

Google
created on 2022-01-14

Microsoft
created on 2022-01-14

vessel 89.9
jar 86.1
vase 71.9
ivory 67.3
artifact 54.3
ceramic ware 35.6

Color Analysis

Feature analysis

Amazon

Milk 93.8%

Categories

Imagga

food drinks 100%

Captions

Microsoft
created on 2022-01-14

a vase sitting on a table 36.2%
a vase sitting on top of a table 33.5%
a vase on a table 33.4%