Human Generated Data

Title

Bichrome Jug / Oinochoe

Date

1100-700 BCE

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of David M. Robinson, 1960.268

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Jug 97.9
Pottery 96.2
Jar 82.4
Vase 82.4
Rug 63.0

Imagga
created on 2022-01-29

vessel 53.7
earthenware 52.4
ceramic ware 47.2
vase 47.2
container 46.6
utensil 38.2
jar 31.5
pitcher 31.1
china 31
pot 23.6
jug 23
traditional 21.6
culture 21.3
decoration 21
object 19.8
porcelain 19.6
old 19.5
teapot 19.2
pottery 18.2
art 17.6
clay 17.6
drink 17.5
ancient 17.3
brown 16.9
antique 16.9
cup 15.3
majolica 14.8
ornate 14.6
craft 14.3
close 14.3
decorative 14.2
tea 14.1
black 13.8
ornament 13.8
circle 13.8
metal 12.9
ceramics 12.7
history 12.5
handle 12.4
design 12.4
gold 12.3
retro 12.3
ceramic 11.6
classical 11.5
bell 10.6
coffee 10.2
closeup 10.1
symbol 10.1
obsolete 9.6
vintage 9.1
single 9
kitchen 8.9
style 8.9
round 8.6
pattern 8.2
bottle 8.1
decor 8
yellow 7.9
acoustic device 7.9
terracotta 7.9
lid 7.8
color 7.8
past 7.7
ethnicity 7.7
god 7.7
texture 7.6
old fashioned 7.6
oriental 7.5
east 7.5
shape 7.5
water 7.3
refreshment 7.2
beverage 7.2
morning 7.2

Google
created on 2022-01-29

Dishware 93.3
Tableware 92.7
Serveware 86.6
Drinkware 85.2
Pottery 82.7
Artifact 82.5
earthenware 79.1
Porcelain 77.8
Wood 77.3
Art 76.6
Ceramic 73.6
Circle 73.2
Metal 69.1
Vase 65.9
Antique 65.5
Event 62.4
Collectable 56.8
Home accessories 51.5
Still life 50.3

Microsoft
created on 2022-01-29

vase 78.5
gold 69.3
ceramic ware 68.4
bronze 66.1
earthenware 61.9
vessel 59.4
art 57.7
pottery 53.4
stoneware 34.6
porcelain 13.8

Feature analysis

Amazon

Rug 63%

Captions

Microsoft

a close up of a cup 27%
a close up of an animal 26.9%
a close up of a statue 26.8%