Human Generated Data

Title

Askos

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of David M. Robinson, 1960.398

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Plant 94.7
Food 90
Vegetable 88.5
Produce 87.1
Pumpkin 84.2
Gourd 64.5

Clarifai
created on 2023-10-27

clay 99.8
art 99.1
no person 98.8
pottery 98.8
one 97.6
container 95.1
desert 93.4
mud 93.1
sculpture 92.7
isolated 92.4
handmade 92.1
retro 92
broken 89.7
hand 89.7
ceramic 89.6
toy 89.6
dry 88.8
nature 88.6
man 88.3
still life 88

Imagga
created on 2022-01-29

pitcher 85.5
grenade 77.2
vessel 68.3
bomb 61.7
container 55.5
explosive device 46.8
weaponry 30.5
orange 16.1
old 16
earthenware 14.9
device 14.8
vase 14.7
object 14.6
brown 14
ball 14
decoration 13.7
pumpkin 13.7
yellow 13.2
art 11.7
sport 11.5
autumn 11.4
culture 11.1
fall 10.8
traditional 10.8
holiday 10.7
single 10.7
antique 10.5
texture 10.4
round 10.3
decorative 10
vintage 9.9
jar 9.7
close 9.7
utensil 9.7
soccer 9.6
black 9.6
football 9.6
jug 9.5
ancient 9.5
leather 9.5
basket 9.5
competition 9.1
game 8.9
ceramic ware 8.8
basketball 8.8
play 8.6
food 8.4
rough 8.2
retro 8.2
team 8.1
wooden 7.9
seasonal 7.9
scary 7.7
pattern 7.5
harvest 7.5
wood 7.5
one 7.5
man 7.4
sphere 7.3
light 7.3
paper 7

Google
created on 2022-01-29

Creative arts 84.8
Artifact 84.3
Art 83.8
Pottery 81.4
Wood 81.3
Vase 77.7
earthenware 75.6
Serveware 74.7
Bag 74.3
Clay 74.3
Carmine 61.8
Peach 61.2
Fashion accessory 59.8
Visual arts 59
Plywood 57.3
Ceramic 55.6
Craft 53.7
Home accessories 53.6
Circle 50.7
Pattern 50.4

Microsoft
created on 2022-01-29

brown 85.8
art 74.5
vessel 74.1
plant 72.6
jar 60.7
ceramic ware 33.8

Color Analysis

Categories

Imagga

interior objects 95.6%
paintings art 2.4%
food drinks 1.9%