Human Generated Data

Title

RED WARE NECK AND HANDLES OF AN AMPHORA

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.3059

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Jug 99.1
Pottery 81.1
Stein 65
Food 63.4
Axe 62.5
Tool 62.5
Bronze 58.7
Water Jug 56.3

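The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels API. The snippet below is a minimal sketch of how similar tags could be generated for this object's image; the local filename, the configured AWS credentials, and the 50% confidence cutoff are assumptions for illustration, not part of the original record.

```python
import boto3

# Minimal sketch: assumes AWS credentials are configured and the object's
# image has been downloaded locally (the filename below is hypothetical).
rekognition = boto3.client("rekognition")

with open("1977.216.3059.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=50,  # assumed cutoff; the record lists tags down to ~56%
    )

# Print "Label confidence" pairs in the same format as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```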
Imagga
created on 2022-01-23

pitcher 92.5
vessel 76
container 59.3
earthenware 49.3
ceramic ware 34.5
utensil 28.7
drink 27.5
mug 22.8
cup 22.6
tea 19.5
coffee 18.5
jug 18.4
brown 18.4
beverage 17.2
breakfast 16.8
handle 16.2
food 14.5
traditional 14.1
old 13.2
glass 13.2
black 13.2
kitchen 12.5
antique 12.3
pot 12
object 11.7
wood 11.7
culture 11.1
close 10.8
ceramic 10.6
stirrup 10.4
vase 10.1
art 10.1
cups 9.8
table 9.5
pottery 9.4
hot 9.2
morning 9
support 8.9
ceramics 8.8
teapot 8.7
ancient 8.6
restaurant 8.6
water jug 8.6
yellow 8.6
device 8.3
single 8.2
decoration 8
bottle 7.9
liquid 7.8
color 7.8
nobody 7.8
beer 7.8
craft 7.6
aroma 7.5
one 7.5
paper 7
wooden 7

Google
created on 2022-01-23

Drinkware 89.6
Tableware 87.9
Clay 86.6
Serveware 82.7
Artifact 82.1
Sleeve 80.7
Pottery 68
Sculpture 67.2
Fashion accessory 66.1
Wood 63.7
Metal 62.4
Event 61.2
Art 57.4
earthenware 53.6
Pattern 51.5
Ceramic 51.2

Microsoft
created on 2022-01-23

doughnut 98.6
vessel 68.3
ceramic ware 25.7

Color Analysis

Feature analysis

Amazon

Axe 62.5%

Captions

Microsoft

a close up of a doughnut 45.3%
a large doughnut 33.1%
a close up of a donut 33%