Human Generated Data

Title: Korean Sherd

Date: -

People: -

Classification: Fragments

Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.539
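
This record belongs to the Harvard Art Museums collection, which also exposes object data through its public API (api.harvardartmuseums.org). The Python sketch below is a hedged illustration of how this object might be looked up programmatically; the API key placeholder and the free-text query on the accession number 1995.1169.539 are assumptions, not details taken from the record itself.

import requests

API_KEY = "YOUR_HARVARD_ART_MUSEUMS_API_KEY"  # placeholder; a real key is required

# Free-text search on the accession number taken from the credit line above.
# Assumption: this query is specific enough to surface the single object.
response = requests.get(
    "https://api.harvardartmuseums.org/object",
    params={"apikey": API_KEY, "q": "1995.1169.539"},
    timeout=30,
)
response.raise_for_status()

for record in response.json().get("records", []):
    print(record.get("title"), "-", record.get("classification"))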

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2019-07-07

Pottery 98
Porcelain 98
Art 98
Jar 95.3
Vase 91.5
Bread 80.9
Food 80.9
Archaeology 64.6
Urn 58.2
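
The values above read as label/confidence pairs from an automated image-labeling service. As an illustration only, the following Python sketch shows how tags of this shape could be generated with Amazon Rekognition; the choice of service, the file name, and the thresholds are assumptions, since this record only names "Amazon" and a creation date.

import boto3

# Assumed setup: AWS credentials configured locally and a photograph of the
# sherd saved as "korean_sherd.jpg" (hypothetical file name).
rekognition = boto3.client("rekognition")

with open("korean_sherd.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=10,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence labels
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")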

Clarifai
created on 2019-07-07

no person 95.8
one 94.7
cutout 94.5
desktop 92.7
still life 91.6
decoration 90.7
container 89.6
food 89.3
art 89.2
sculpture 86.9
shape 85.8
old 85.4
two 84.1
simplicity 83.9
stone 83.1
single 82.8
disjunct 81.5
grow 81.4
closeup 81.1
round out 80.9

Imagga
created on 2019-07-07

food 26.3
chocolate 23.5
snack 22.2
brown 19.9
dessert 18.5
sweet 17.4
hole 17.4
delicious 16.5
money 14.5
tasty 14.2
pastry 14.2
baked 14.1
sugar 14.1
gourmet 13.6
closeup 12.8
old 12.5
dark 12.5
nobody 12.4
bakery 12.4
treat 12.4
healthy 12
candy 11.8
close 11.4
slice 10.9
eat 10.9
currency 10.8
diet 10.5
cream 10.4
cake 10.3
black 10.2
eating 10.1
decoration 10
bark 9.5
bread 9.3
savings 9.3
finance 9.3
fresh 9.1
iron 9
wealth 9
meal 8.9
object 8.8
unhealthy 8.6
coin 8.6
piece 8.5
dinner 8.4
nutrition 8.4
fat 8.4
antique 8.4
device 8.4
investment 8.2
fruit 8.2
bank 8.1
liquid 7.9
business 7.9
cookie 7.8
whole 7.6
taste 7.4
single 7.4
banking 7.4
cash 7.3
vessel 7.2
financial 7.1
ingredient 7.1
breakfast 7.1
paper 7.1

Google
created on 2019-07-07

Ceramic 91.4
earthenware 91.1
Pottery 87.7
Tree 74.7
Artifact 73.3
Art 69.5
Beige 63.7
Rock 54.2
Craft 51.4
Vase 50.7
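
Similarly, the Google tags read as label/score pairs of the kind produced by Google Cloud Vision label detection. The sketch below is an assumption about how comparable tags could be generated, not a description of how this record was made; the file name is hypothetical, and Cloud Vision's 0-1 scores are scaled to the 0-100 range shown here.

from google.cloud import vision

# Assumed setup: GOOGLE_APPLICATION_CREDENTIALS points to a service-account key
# and a photograph of the sherd is saved as "korean_sherd.jpg" (hypothetical).
client = vision.ImageAnnotatorClient()

with open("korean_sherd.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.label_detection(image=image)

for annotation in response.label_annotations:
    # Scale the 0-1 score to the 0-100 range used in the list above.
    print(f"{annotation.description} {annotation.score * 100:.1f}")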

Microsoft
created on 2019-07-07

snake 79.5
reptile 64.8
ceramic ware 32.7
stoneware 19.3

Color Analysis

Feature analysis

Amazon

Bread 80.9%

Categories

Imagga

paintings art 79.1%
pets animals 15.6%
food drinks 4.8%

Captions