Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.528

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Art 99.7
Pottery 99.7
Porcelain 99.7
Food 98
Bread 98
Jar 83.5
Vase 83.5
Figurine 75.9
Crystal 56.7
Cushion 55.4

Clarifai
created on 2019-07-07

no person 97.2
one 95.5
wear 94.4
cutout 91.8
desktop 90.6
two 88.4
isolated 87.9
disjunct 86.4
fashion 86.2
footwear 84.9
still life 84.1
color 83
texture 80.7
traditional 78.2
old 74.4
art 73.9
shape 73.2
pattern 72.1
conceptual 71.4
food 71.1

Imagga
created on 2019-07-07

container 69.6
pencil box 54.8
bag 49.4
box 39.8
footwear 35.5
shoes 30.7
pair 30.2
shoe 27.8
leather 27.8
clothing 27.7
fashion 26.4
foot 25.8
purse 21.8
fastener 21.7
wear 21.1
brown 20.6
new 18.6
two 17
restraint 14.9
object 13.2
boot 12.8
lace 11.6
slide fastener 11.5
accessory 11.5
objects 11.3
cotton 11.3
sandal 11.1
elegance 10.9
close 10.9
boots 10.7
buckle 10.5
wallet 10.5
clog 10.3
human 9.8
covering 9.7
walk 9.5
healthy 9.5
clothes 9.4
nobody 9.3
male 9.2
style 8.9
color 8.9
sandals 8.9
cloth 8.6
personal 8.6
men 8.6
casual 8.5
black 8.4
old 8.4
closeup 8.1
food 7.9
rubber 7.7
orange 7.7
money 7.7
finance 7.6
sock 7.5
traditional 7.5
dress 7.2
wealth 7.2
currency 7.2
device 7.2
shiny 7.1

Google
created on 2019-07-07

Leaf 83.4
Rock 81.8
Beige 74.9
Art 50.2

Microsoft
created on 2019-07-07

butterfly 89.9

Feature analysis

Amazon

Bread 98%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-07-07

a close up of an animal 33.2%
close up of an animal 27.5%
a close up of a logo 27.4%