Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.578
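
Catalog fields like the ones above can also be pulled programmatically from the museum's public API. The following is a minimal sketch only: the API key is a placeholder, the search query is illustrative, and the field names read from the response (title, classification, creditline) are assumptions rather than values confirmed for this specific record.

# Minimal sketch: fetching a catalog record from the Harvard Art Museums API.
# API key is a placeholder; field names are assumed, not verified here.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
OBJECT_ENDPOINT = "https://api.harvardartmuseums.org/object"

params = {"apikey": API_KEY, "q": "Korean Sherd", "size": 1}  # illustrative query
resp = requests.get(OBJECT_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()

for record in resp.json().get("records", []):
    print(record.get("title"), "|",
          record.get("classification"), "|",
          record.get("creditline"))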

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Wood 92.5%
Plywood 84.8%
Clothing 68.4%
Apparel 68.4%
Hat 68.4%
Cushion 56.4%
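
Label-and-confidence pairs like the Amazon list above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch follows, assuming a hypothetical local image file and illustrative thresholds; it is not necessarily the exact pipeline used to generate the tags above.

# Minimal sketch: producing label/confidence pairs with Amazon Rekognition.
# Image path, MaxLabels, and MinConfidence are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sherd.jpg", "rb") as f:  # hypothetical local image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence percentage,
    # matching the "Wood 92.5%" style entries listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')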

Clarifai
created on 2019-07-07

no person 98.3%
wear 96.4%
desktop 93.7%
one 93.6%
cutout 91.4%
wood 90.9%
closeup 90.9%
wooden 90.3%
texture 88.2%
single 87.7%
old 87.6%
furniture 87.2%
food 83.7%
stranded 83.1%
pattern 82.7%
retro 82%
design 81.5%
disjunct 81.4%
vintage 80.6%
isolated 80.5%

Imagga
created on 2019-07-07

gastropod 74.3%
mollusk 61.9%
conch 40.1%
invertebrate 39.7%
shell 34.3%
brown 26.5%
animal 26.1%
food 23.2%
close 21.1%
tile 20.6%
nobody 16.3%
closeup 16.2%
breakfast 15.9%
bread 15.7%
snail 14.9%
objects 14.8%
delicious 14%
slice 13.6%
object 13.2%
meal 13%
detail 12.9%
snack 12.8%
natural 12.7%
bakery 12.4%
pillow 12%
sea 11.8%
desert 11.8%
eat 11.7%
tasty 11.7%
wheat 11.4%
piece 10.6%
fresh 10.4%
sweet 10.3%
corbel 10.2%
healthy 10.1%
nutrition 10.1%
yellow 9.9%
cushion 9.7%
golden 9.5%
color 9.4%
horizontal 9.2%
tract 9.2%
chocolate 8.8%
dessert 8.8%
slow 8.8%
decoration 8.7%
sliced 8.7%
holiday 8.6%
studio 8.4%
grain 8.3%
toast 8.2%
bracket 8.1%
light 8%
loaf 7.8%
sandwich 7.7%
bake 7.7%
spiral 7.6%
pattern 7.5%
baked 7.5%
padding 7.3%
paper 7.1%
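
The Imagga tags above take the same tag-plus-confidence form returned by Imagga's tagging endpoint. A minimal sketch follows, assuming placeholder credentials, a hypothetical image URL, and the response layout noted in the comments.

# Minimal sketch: requesting tags from the Imagga REST API.
# Credentials and image URL are placeholders; the response layout
# (result -> tags -> tag/en, confidence) is assumed here.
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.org/sherd.jpg"  # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')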

Google
created on 2019-07-07

Beige 78.4%
Artifact 73.5%
earthenware 70.1%
Wood 59.6%
Stone carving 57.2%
Clay 55.8%
Ceramic 55.6%
Rock 54.2%
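
The Google labels above correspond to label-detection scores. A minimal sketch with the Google Cloud Vision client follows, assuming a hypothetical local image file; scores come back in the 0-1 range and are scaled to percentages here to match the list above.

# Minimal sketch: label detection with the Google Cloud Vision client.
# The local file path is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sherd.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Scale 0-1 scores to percentages for comparison with the tags above.
    print(f"{label.description} {label.score * 100:.1f}")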

Microsoft
created on 2019-07-07

accessory 33.5%

Feature analysis

Amazon

Hat 68.4%

Categories

Imagga

macro flowers 49.9%
paintings art 42.3%
food drinks 7.2%

Captions

Microsoft
created on 2019-07-07

a brown and black hat 52.4%
a close up of a hat 52.3%
a close up of a white wall 48.8%