Human Generated Data

Title

Pitcher

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, G. G. Van Rensselaer Fund, 1977.216.3388

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Vase 97.8
Jar 97.8
Pottery 97.8
Milk 96.9
Beverage 96.9
Drink 96.9
Urn 67.7
Jug 55.1
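
The Amazon tags above are label/confidence pairs of the kind returned by an automated image-labeling service such as Amazon Rekognition. As a rough, hypothetical sketch only (the actual pipeline, image source, and parameters behind these tags are not documented here), similar label/confidence output could be requested with boto3's detect_labels call:

import boto3

# Hypothetical sketch: produce label/confidence pairs like those listed above.
# The region, file name, and thresholds are placeholders, not the real setup.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("pitcher.jpg", "rb") as image_file:  # placeholder local image
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=25,
        MinConfidence=50,  # would keep a weak label such as "Jug 55.1" but drop lower scores
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')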

Imagga
created on 2022-01-29

jar 100
vase 100
vessel 100
container 93.2
traditional 29.9
decoration 26.1
pot 24.1
object 22
earthenware 21.4
culture 21.4
glass 21
decorative 20.9
old 20.2
pottery 19.6
art 19.5
brown 18.4
jug 18.1
ornament 18.1
ancient 17.3
handle 17.2
craft 17.2
china 16.9
drink 16.7
ceramic 16.5
antique 16.4
pitcher 16.3
clay 15.6
single 15.6
ceramics 14.7
classical 14.3
ornate 13.7
history 13.4
liquid 13
style 12.6
vintage 12.4
gold 12.3
metal 12.1
golden 12
water 12
close 12
ceramic ware 11.9
lid 11.8
tea 11.3
yellow 11.3
porcelain 11.2
food 10.9
teapot 10.8
closeup 10.8
transparent 10.7
utensil 10.7
retro 10.7
color 10.6
oriental 10.4
cup 10.2
terracotta 9.9
texture 9.7
celebration 9.6
tradition 9.2
black 9
kitchen 8.9
urn 8.9
souvenir 8.8
shiny 8.7
old fashioned 8.6
flower 8.5
design 8.4
relaxation 8.4
pattern 8.2
natural 8
ball 7.9
obsolete 7.7
ornamental 7.6
healthy 7.6
one 7.5
sphere 7.4
holiday 7.2
leaf 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

sitting 96.5
plant 95.1
ceramic 92.6
vase 92.3
indoor 88.5
black 79.6
ceramic ware 69.3
vessel 69.2
earthenware 67.4
pottery 66.8
close 34.8
jar 34.3
displayed 33.9
base 20.1
stoneware 15.8
porcelain 9.1

Feature analysis

Amazon

Milk 96.9%

Captions

Microsoft

a close up of a vase 72.8%
a vase sitting on a wooden surface 51.7%
a close up of a vase on a table 51.6%
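
The Microsoft captions above are caption/confidence pairs in the style returned by Azure Computer Vision's "describe image" operation. As a hypothetical sketch only, assuming an Azure Cognitive Services resource (the endpoint, key, and image URL below are placeholders, not the actual configuration), similar captions could be requested like this:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical sketch: request image captions with confidence scores.
# Endpoint, key, and image URL are placeholders.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

description = client.describe_image(
    "https://example.com/pitcher.jpg",
    max_candidates=3,
)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")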