Human Generated Data

Title

Ceramic Vessel

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.137

Machine Generated Data

Tags

Each tag below is paired with the generating service's confidence score on a 0-100 scale.

Amazon
created on 2022-01-29

Pottery 90.8
Jar 84.6
Clothing 83.9
Footwear 83.9
Apparel 83.9
Shoe 83.9
Sock 83.9
Vase 74.8
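
Labels of this shape are what Amazon Rekognition's DetectLabels operation returns. Below is a minimal sketch of producing such tags with boto3; the record does not say which settings were used, so the file name, MaxLabels, and MinConfidence values are illustrative assumptions, and AWS credentials are presumed to be configured in the environment.

```python
# Sketch: label an image with Amazon Rekognition (boto3).
# "vessel.jpg" and the thresholds are illustrative placeholders.
import boto3

def detect_labels(image_path: str, min_confidence: float = 70.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,
            MinConfidence=min_confidence,
        )
    # Each label carries a Name and a Confidence score (0-100),
    # matching the "Pottery 90.8" style of the list above.
    return [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]

for name, conf in detect_labels("vessel.jpg"):
    print(f"{name} {conf:.1f}")
```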

Imagga
created on 2022-01-29

vase 100
jar 100
vessel 100
container 78.4
food 19.9
brown 19.9
drink 19.2
glass 17.1
bread 14.8
object 14.6
slice 12.7
china 12.7
close 12.5
fresh 12.4
delicious 12.4
loaf 11.6
baked 11.2
ancient 11.2
paper 11
pot 10.7
breakfast 10.6
diet 10.5
ceramic ware 10.4
black 10.2
alcohol 10.2
cup 10.1
nobody 10.1
traditional 10
bubbles 9.5
bubble 9.4
snack 9.4
refreshment 9.1
old 9
beverage 9
porcelain 9
meal 8.9
jug 8.8
crust 8.7
liquid 8.7
natural 8.7
cold 8.6
chocolate 8.4
healthy 8.2
sweet 7.9
rye 7.8
clay 7.8
antique 7.8
cereal 7.7
culture 7.7
wheat 7.6
tasty 7.5
closeup 7.4
earthenware 7.4
grain 7.4
dessert 7
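
Imagga exposes tagging through its v2 /tags REST endpoint, which returns exactly this kind of tag/confidence pairing. A hedged sketch using the requests library follows; the API key, secret, and image URL are placeholders, not values from this record.

```python
# Sketch: tag an image with the Imagga v2 /tags endpoint.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

def imagga_tags(image_url: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    tags = response.json()["result"]["tags"]
    # Each entry pairs a 0-100 confidence with a tag name keyed by
    # language, e.g. {"confidence": 100, "tag": {"en": "vase"}}.
    return [(t["tag"]["en"], t["confidence"]) for t in tags]
```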

Google
created on 2022-01-29

Artifact 84.3
Art 82.8
Fashion accessory 67.3
Font 67.2
Human leg 67.1
Rock 66.5
Wood 65.4
Stone tool 63.4
Ceramic 63.3
Illustration 62.4
Metal 58.3
Natural material 56.4
Pattern 54.6
Visual arts 53.3
Piano 51.9
Thigh 51.2
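
These labels are the shape returned by Google Cloud Vision label detection, which scores labels on a 0-1 scale (rendered above as percentages). A minimal sketch, assuming the google-cloud-vision client library is installed and application credentials are configured; the file path is a placeholder.

```python
# Sketch: label an image with Google Cloud Vision label detection.
from google.cloud import vision

def google_labels(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Scores are reported on a 0-1 scale; multiply by 100 to match
    # the percentages shown above (e.g. "Artifact 84.3").
    return [(a.description, a.score * 100) for a in response.label_annotations]
```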

Microsoft
created on 2022-01-29

indoor 92.5
ceramic ware 85.1
jar 83
vessel 77.7
artifact 68.6
plant 62.3
porcelain 16.9
stoneware 16.7
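
Microsoft's tags match the output of the Azure Computer Vision Analyze endpoint with the Tags visual feature. Below is a sketch against the v3.2 REST API; the endpoint, subscription key, and API version are assumptions, since the record does not state how these tags were generated.

```python
# Sketch: tag an image with the Azure Computer Vision "Analyze" REST API.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

def azure_tags(image_url: str):
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    # Confidence comes back on a 0-1 scale; scale it to match the list above.
    return [(t["name"], t["confidence"] * 100) for t in response.json()["tags"]]
```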

Feature analysis

Amazon

Sock 83.9%

Captions

Microsoft

a vase sitting on a table 47.9%
a vase sitting on top of a table 44.3%
a vase sitting on top of each other 36.4%
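
Ranked caption candidates with confidence scores like these are what the Azure Computer Vision Describe endpoint returns. A minimal sketch follows, using the same placeholder endpoint and key conventions as the tagging example; maxCandidates is an illustrative assumption.

```python
# Sketch: generate captions with the Azure Computer Vision "Describe" REST API.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

def azure_captions(image_url: str, max_candidates: int = 3):
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": max_candidates},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    captions = response.json()["description"]["captions"]
    # Each candidate pairs a sentence with a 0-1 confidence, matching
    # "a vase sitting on a table 47.9%" above once scaled to percent.
    return [(c["text"], c["confidence"] * 100) for c in captions]
```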