Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.502

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Accessory 93.3
Accessories 93.3
Jewelry 86.8
Ornament 80.4
Animal 76.7
Fish 76.7
Gemstone 76.4
Bread 74
Food 74
Pottery 70.1

Clarifai
created on 2019-07-07

art 98
one 98
desktop 96.3
no person 96.1
wear 94.7
texture 94.3
color 94.2
jewelry 94.1
cutout 93.9
old 93.4
container 93
pattern 92.9
design 91.5
retro 90.2
decoration 89.8
vintage 89.2
still life 88.4
pottery 88.1
money 87.8
hand 85.9

Imagga
created on 2019-07-07

bag 100
container 100
purse 98.1
wallet 33
case 23.8
box 21.4
money 21.3
object 21.3
fashion 18.1
currency 17.1
shopping 16.5
gift 16.3
accessory 16.2
close 15.4
finance 15.2
pencil box 14.5
wealth 14.4
leather 14.3
decoration 13.8
single 13.2
cash 12.8
brown 12.5
color 12.2
luxury 12
paper 11.8
gold 11.5
mailbag 11.5
new 11.3
celebration 11.2
dollar 11.1
nobody 10.9
retail 10.5
style 10.4
business 10.3
savings 10.3
rich 10.2
casual 10.2
bank 9.9
clothing 9.8
banknote 9.7
hundred 9.7
design 9.6
personal 9.6
golden 9.5
closeup 9.4
elegance 9.2
present 9.1
black 9
shiny 8.7
dollars 8.7
holiday 8.6
pink 8.4
ribbon 8.3
pattern 8.2
success 8
detail 8
financial 8
pocket 7.8
paying 7.8
jewelry 7.7
cloth 7.7
pay 7.7
textile 7.6
fabric 7.4
open 7.2
material 7.2

Google
created on 2019-07-07

earthenware 85.9
Ceramic 84.6
Artifact 82.1
Rock 64.4
Pottery 63.5
Art 50.2

Microsoft
created on 2019-07-07

animal 94.2
reptile 90.2
turtle 88.8

Feature analysis

Amazon

Fish 76.7%
Bread 74%

Categories

Imagga

paintings art 99.6%

Captions

Microsoft
created on 2019-07-07

a turtle on a white surface 40%
a close up of a turtle 39.9%
close up of a turtle 39.8%