Human Generated Data

Title
Pitcher

Date

People

Classification
Vessels

Credit Line
Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, G. G. Van Rensselaer Fund, 1977.216.3394

Machine Generated Data

Tags (each tag is followed by its confidence score, in percent)

Amazon
created on 2022-01-29

Jug 96.6
Pottery 92.7
Jar 69
Vase 62.3
Person 61
Human 61
Pot 57.5
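
The Amazon tags above pair a label with a confidence percentage. As a minimal, hypothetical sketch (assuming these labels come from Amazon Rekognition's DetectLabels API; the bucket and object names below are placeholders, not the museum's actual storage):

```python
import boto3

# Hypothetical sketch: request labels for an object image and print
# "Label Confidence" pairs in the same form as the list above.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1977.216.3394.jpg"}},
    MaxLabels=10,
    MinConfidence=50,
)

# Each returned label carries a confidence percentage, e.g. "Jug 96.6".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```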

Imagga
created on 2022-01-29

china 64.1
porcelain 47.4
ceramic ware 46.3
vase 43.1
container 38
utensil 38
vessel 38
pot 29.7
earthenware 27.3
traditional 25.8
pitcher 25
decoration 23.3
pottery 23
jar 22.5
teapot 21.6
drink 20.9
art 20.8
object 20.5
brown 19.9
decorative 18.4
handle 18.1
tea 18
culture 17.9
clay 17.6
craft 17.2
cup 17
old 16
ceramics 15.6
ornate 15.5
ornament 15.5
jug 15.3
food 15.1
antique 15
single 14.8
ceramic 14.5
close 13.7
wicker 13.5
kitchen 13.4
gold 13.1
ancient 13
glass 10.9
retro 10.6
objects 10.4
product 10.4
luxury 10.3
color 10
water 10
liquid 9.6
old fashioned 9.5
golden 9.5
healthy 9.4
closeup 9.4
work 9.2
flower 9.2
beverage 9.2
design 9
history 8.9
pattern 8.9
decor 8.8
lid 8.8
restaurant 8.6
yellow 8.6
classical 8.6
holiday 8.6
black 8.4
east 8.4
texture 8.3
one 8.2
style 8.2
domestic 8.1
souvenir 7.8
handmade 7.7
decorated 7.7
herbal 7.6
oriental 7.6
vintage 7.4
coffee 7.4
fruit 7.3
creation 7.3
metal 7.2
celebration 7.2
colorful 7.2

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

art 61.4
ceramic ware 59.3
gold 52.9
vessel 50.3
porcelain 15.9
stoneware 15.8

Feature analysis

Amazon

Person 61%

Captions

Microsoft

a close up of a statue 75.7%
a statue of a vase 44.7%
close up of a statue 44.6%