Human Generated Data

Title

Amphora

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Peabody Museum of Archaeology and Ethnology, Harvard University, 1978.495.147
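
Records like this one can also be retrieved programmatically through the museums' public API at api.harvardartmuseums.org. A minimal sketch in Python, assuming a valid API key and assuming the objectnumber query syntax shown here matches the deployed API (the key and query form are assumptions, not confirmed by this record):

    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder; keys are issued by the museum
    URL = "https://api.harvardartmuseums.org/object"

    # Look up the record by the object number given in the credit line above.
    params = {"apikey": API_KEY, "q": "objectnumber:1978.495.147"}
    resp = requests.get(URL, params=params, timeout=30)
    resp.raise_for_status()

    for record in resp.json().get("records", []):
        print(record.get("title"), "-", record.get("classification"))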

Machine Generated Data

Tags

Each tag below is paired with the generating service's confidence score on a 0-100 scale.

Amazon
created on 2023-08-30

Jug 100
Cup 99.2
Water Jug 95.3
Pottery 71.5
Jar 57.7
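
Label/confidence pairs like these are what Amazon Rekognition's DetectLabels operation returns. A minimal sketch using boto3; the image filename and the MinConfidence threshold are illustrative assumptions, and AWS credentials are assumed to be configured in the environment:

    import boto3

    client = boto3.client("rekognition")

    # Read the image bytes; the filename is a placeholder.
    with open("amphora.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=50.0,  # only return labels at or above 50% confidence
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')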

Clarifai
created on 2023-11-01

jug 99.9
carafe 99.8
pottery 99.7
no person 99.5
clay 99.3
empty 98.9
jar 98.7
retro 98.6
antique 98.3
vase 98.2
container 97.3
ancient 96.1
art 96
vintage 94.8
simplicity 94
old 93.8
handmade 93.8
artistic 92.9
watercraft 90.7
one 90.6
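
A comparable request against Clarifai's v2 REST API might look like the sketch below. The model id, API key, and image URL are all placeholders, and the exact auth scheme and model id are assumptions about how these tags were produced:

    import requests

    API_KEY = "YOUR_CLARIFAI_KEY"  # placeholder
    MODEL_ID = "general-image-recognition"  # assumed id of the general model
    URL = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

    payload = {
        "inputs": [
            {"data": {"image": {"url": "https://example.com/amphora.jpg"}}}
        ]
    }
    headers = {"Authorization": f"Key {API_KEY}"}

    resp = requests.post(URL, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()

    # Concepts carry a 0-1 value; scale to match the 0-100 scores listed above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')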

Imagga
created on 2018-12-21

vessel 100
container 100
pitcher 100
jug 61.1
vase 37.1
water jug 36.3
drink 34.2
pot 30.8
bottle 29.8
cup 24.4
handle 22.9
object 22.7
traditional 22.5
ceramic 22.3
beverage 21.7
glass 21
liquid 20.9
pottery 20.6
culture 20.5
antique 19.9
ceramics 18.6
old 17.4
decoration 17.4
craft 17.2
tea 16.9
jar 16.8
decorative 16.7
clay 16.6
ancient 16.4
mug 16.3
milk 15.3
kitchen 15.2
breakfast 15
coffee 14.8
classical 14.3
art 14.3
food 13.9
history 12.5
single 12.3
retro 12.3
ornate 11.9
terracotta 11.9
brown 11.8
earthenware 10.9
utensil 10.8
vintage 10.7
healthy 10.7
ornament 10.3
black 10.2
refreshment 10
teapot 9.8
pouring 9.7
health 9.7
style 9.6
cold 9.5
close 9.1
china 8.9
beer 8.7
water 8.7
empty 8.6
nobody 8.5
design 8.4
hot 8.4
transparent 8.1
restaurant 7.8
dairy 7.7
obsolete 7.7
pattern 7.5
product 7.5
alcohol 7.4
metal 7.2
morning 7.2
color 7.2
porcelain 7.2
fresh 7.2
life 7
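
Imagga exposes tagging through a REST endpoint authenticated with an API key/secret pair. A minimal sketch; the credentials and image URL are placeholders, and the v2 endpoint is an assumption about which API version produced the list above:

    import requests

    # Imagga uses HTTP Basic auth with a key/secret pair (placeholders here).
    AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")
    URL = "https://api.imagga.com/v2/tags"

    params = {"image_url": "https://example.com/amphora.jpg"}  # placeholder
    resp = requests.get(URL, params=params, auth=AUTH, timeout=30)
    resp.raise_for_status()

    # Tags come back with a 0-100 confidence, matching the scale above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')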

Google
created on 2018-12-21

(no tags were recorded for this service in this extraction)
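
Had tags been captured, the corresponding request would go through Cloud Vision's label detection. A minimal sketch with the google-cloud-vision client; the filename is a placeholder, and credentials are assumed to be configured via the environment:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("amphora.jpg", "rb") as f:  # placeholder filename
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1; scale to percentages for comparison with other services.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")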

Microsoft
created on 2018-12-21

indoor 91.7
vessel 89.7
black 80.1
white 73.9
plant 56.6
single 53.6
vase 53.6
pottery 23.3
ceramic 5.7
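
Tags in this shape are what Azure Computer Vision's analyze operation returns. A minimal sketch over the REST API; the resource endpoint, key, image URL, and the v3.2 route are all placeholders or assumptions, not confirmed by this record:

    import requests

    # Endpoint and key are placeholders; the v3.2 analyze route is an
    # assumption about which service version produced these tags.
    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    URL = f"{ENDPOINT}/vision/v3.2/analyze"

    headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}
    params = {"visualFeatures": "Tags"}
    body = {"url": "https://example.com/amphora.jpg"}  # placeholder URL

    resp = requests.post(URL, headers=headers, params=params, json=body,
                         timeout=30)
    resp.raise_for_status()

    # Confidences are 0-1; scale to match the 0-100 scores above.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')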

Color Analysis

(color data not captured in this extraction)

Feature Analysis

Amazon

Cup 99.2%

Categories

Imagga

food drinks 93.8%
interior objects 5.8%
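
The feature analysis above surfaces only the highest-confidence labels (here, Amazon's Cup at 99.2%). One way to reproduce that kind of filtering over the raw tag lists, sketched with illustrative data sampled from this record (the structures and threshold are not part of any vendor API):

    # Per-service tags as (label, confidence) pairs, sampled from above.
    tags = {
        "Amazon": [("Jug", 100.0), ("Cup", 99.2), ("Jar", 57.7)],
        "Imagga": [("vessel", 100.0), ("pitcher", 100.0), ("jug", 61.1)],
        "Microsoft": [("vessel", 89.7), ("pottery", 23.3)],
    }

    THRESHOLD = 95.0  # keep only high-confidence labels

    for service, labels in tags.items():
        for name, conf in labels:
            if conf >= THRESHOLD:
                print(f"{service}: {name} {conf:.1f}%")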