Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.611
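The human-generated fields above (title, classification, credit line) mirror what the Harvard Art Museums public API returns for this accession number. Below is a minimal sketch of looking the record up over that API, assuming a registered API key and the documented objectnumber query field; the placeholder key and the printed field names are assumptions and may not match this page exactly.

```python
import requests

# Sketch: fetch this object record from the Harvard Art Museums public API.
# API key is a placeholder; the accession number comes from the credit line.
API_KEY = "YOUR_API_KEY"
OBJECT_NUMBER = "1995.1169.611"

resp = requests.get(
    "https://api.harvardartmuseums.org/object",
    params={"apikey": API_KEY, "q": f"objectnumber:{OBJECT_NUMBER}"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("records", []):
    print(record.get("title"), "|", record.get("classification"), "|", record.get("creditline"))
```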

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-07-07

Pottery 93
Art 93
Porcelain 93
Crystal 91
Wood 86.3
Lamp 82.2
Furniture 77.6
Tabletop 77.6
Jar 67.6
Vase 67.6
Soil 64.6
Mineral 63
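
The Amazon list above has the shape of AWS Rekognition label output: a label name plus a confidence percentage. Here is a hedged sketch of producing a comparable list with boto3; the local file name is a placeholder, and this is not claimed to be the museum's actual pipeline.

```python
import boto3

# Sketch: detect labels for a local photograph of the sherd (file name is
# hypothetical). detect_labels returns names with confidence percentages.
client = boto3.client("rekognition", region_name="us-east-1")

with open("korean_sherd.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=60,
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```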

Clarifai
created on 2019-07-07

no person 98.8
one 98
still life 97.9
food 95.6
grow 95.2
cutout 94.7
wear 92.9
dairy product 90.4
container 86.5
two 83
jewelry 82.4
seashell 79
empty 77.8
simplicity 77.4
three 77.2
invertebrate 73.8
shell 73.8
broken 72.9
paper 71.6
art 71.1
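
Clarifai concepts like these typically come from its general image-recognition model, which reports values on a 0-1 scale. A sketch over the v2 REST API follows; the key, image URL, and model ID are placeholders, and the current model identifier may differ.

```python
import requests

# Sketch: request Clarifai general-model concepts for a hosted image.
# Concept values are 0-1 and are rescaled to percentages when printed.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/korean_sherd.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```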

Imagga
created on 2019-07-07

lampshade 48.6
shade 37.8
food 36.5
protective covering 28.6
tea 27.6
cheese 24.6
brown 19.9
covering 19.7
slice 19.1
bread 18.5
cone 17.8
dessert 17.6
fresh 17
snack 14.5
sweet 14.2
plate 13.9
healthy 13.9
eat 13.4
breakfast 13.3
nobody 13.2
delicious 13.2
close 13.1
gourmet 12.7
yellow 12.6
health 12.5
object 11.7
meal 11.4
diet 11.3
baked 11.2
eating 10.9
nutrition 10.9
candy 10.8
single 10.7
lunch 10.3
toast 10.1
closeup 10.1
dairy 9.7
sandwich 9.7
cooking 9.6
bakery 9.5
piece 9.5
sugar 9.4
chocolate 9
pastry 8.9
cuisine 8.9
loaf 8.7
wheat 8.6
fat 8.4
tasty 8.4
color 8.3
freshness 8.3
fruit 7.9
butter 7.9
unhealthy 7.7
dinner 7.6
soft 7.2
meat 7.2
paper 7.1
glass 7
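
Imagga's tagging endpoint returns English tag names with confidence percentages, matching the list above. A minimal sketch with placeholder credentials and image URL:

```python
import requests

# Sketch: tag a hosted image with Imagga's /v2/tags endpoint (HTTP basic auth).
# Credentials and image URL are placeholders; confidences come back as percentages.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/korean_sherd.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    timeout=30,
)
resp.raise_for_status()
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```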

Google
created on 2019-07-07

Ceramic 53.3
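
A single Google label scored 53.3 is consistent with Cloud Vision label detection, which scores labels on a 0-1 scale. Below is a sketch using the Python client, assuming default environment credentials and a placeholder image URL.

```python
from google.cloud import vision

# Sketch: run Cloud Vision label detection on a hosted image and print
# each label with its score rescaled to a percentage.
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/korean_sherd.jpg"))
response = client.label_detection(image=image)
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```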

Microsoft
created on 2019-07-07

food 82.1
piece 76.1
slice 69.9
white 65.7
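
The Microsoft tags resemble Azure Computer Vision image-tagging output, which reports 0-1 confidences. A hedged sketch against the v3.2 analyze endpoint follows; the resource endpoint, subscription key, and image URL are placeholders.

```python
import requests

# Sketch: request image tags from Azure Computer Vision (v3.2 analyze).
# Tag confidences are 0-1 and are rescaled to percentages when printed.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.org/korean_sherd.jpg"},
    timeout=30,
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```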

Color Analysis

Feature analysis

Amazon

Lamp 82.2%

Categories

Imagga

paintings art 97.6%
food drinks 1.4%

Captions