Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.565.A

Machine Generated Data

Tags

The number after each tag is the service's confidence in that tag, shown as a percentage.

Amazon
created on 2019-07-07

Pottery 98.8
Porcelain 98.4
Art 98.4
Saucer 81.8
Bowl 74.2
Clothing 68.4
Apparel 68.4
Hardware 61.8
Computer 61.8
Mouse 61.8
Electronics 61.8
Vase 56
Jar 56
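
The labels above are the kind returned by Amazon Rekognition's DetectLabels operation, which reports label names with confidence scores on a 0-100 scale. A minimal sketch with boto3; the file name, region, and thresholds are illustrative placeholders, not values used by this page:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sherd.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # illustrative cap on returned labels
        MinConfidence=50.0,  # illustrative confidence floor
    )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")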

Clarifai
created on 2019-07-07

no person 94.8
cutout 93.9
container 93.3
desktop 92.5
one 92.2
stone 89.8
reflection 89.6
closeup 89.4
still life 89.4
art 89.2
wear 88.7
shape 88
rock 85.4
lid 85.3
color 84.2
vintage 84
old 82.3
shadow 82.1
food 81.5
business 80.2
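
Concept lists like this match the output of Clarifai's general prediction model via its v2 REST API. A hedged sketch: the API key and image URL are placeholders, and the model identifier is an assumption about the public general model:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
    MODEL_ID = "general-image-recognition"  # assumed ID of Clarifai's general model

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/sherd.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concepts carry a 0-1 value; the listing above shows them scaled to percent.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")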

Imagga
created on 2019-07-07

device 48.4
bag 37.2
purse 35.1
object 28.6
container 28
mouse 21.2
black 21
leather 19.2
wallet 17.2
close 17.1
money 16.2
computer 16
brake pad 15.5
business 15.2
push button 14.9
restraint 14.8
finance 14.4
personal 14.3
accessory 14.3
technology 14.1
equipment 13.5
brown 13.3
button 13.2
currency 12.6
nobody 12.4
closeup 12.1
wealth 11.7
financial 11.6
buzzer 11.3
shopping 11
cash 11
paper 11
case 10
fashion 9.8
signaling device 9.7
horizontal 9.2
clothing 9.2
optical 8.7
scroll 8.6
electronics 8.6
electronic device 8.4
key 8.4
single 8.2
work 7.9
hardware 7.7
savings 7.5
style 7.4
symbol 7.4
letter 7.3
connection 7.3
metal 7.2
color 7.2
love 7.1
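
These tags match the shape of Imagga's /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP Basic auth and returns confidences on a 0-100 scale. A sketch with placeholder credentials and image URL:

    import requests

    auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholder credentials

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/sherd.jpg"},  # placeholder URL
        auth=auth,
    )
    resp.raise_for_status()

    # Each entry pairs a confidence score with the tag text, keyed by language.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")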

Google
created on 2019-07-07

earthenware 85.5
Beige 70.8
Pottery 61.6
Ceramic 55.6
Artifact 51.7
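
Label sets like this come from Google Cloud Vision's label detection. A minimal sketch with a recent version of the google-cloud-vision client library; the file name is a placeholder and credentials are assumed to be configured in the environment:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("sherd.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1; the listing above shows them scaled to percent.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")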

Microsoft
created on 2019-07-07

ceramic 90.3
ceramic ware 24.3
stoneware 16.9
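
These correspond to the Tags feature of the Azure Computer Vision Analyze endpoint. A hedged REST sketch; the endpoint host, key, API version, and image URL are placeholders:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/sherd.jpg"},  # placeholder URL
    )
    resp.raise_for_status()

    # Tag confidences are 0-1; the listing above shows them as percentages.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")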

Feature analysis

Amazon

Mouse 61.8%

Categories

Imagga

paintings art 99.4%

Captions

Microsoft
created on 2019-07-07

a close up of a white board 42.1%
close up of a white board 39.8%
a close up of a hat 39.7%
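
The ranked captions above are the kind produced by the Describe operation of the same Azure Computer Vision API, which returns several candidate captions with confidences. A hedged sketch using the same placeholder endpoint and key as in the tags example:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},  # ask for several ranked captions
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/sherd.jpg"},  # placeholder URL
    )
    resp.raise_for_status()

    # Caption confidences are 0-1; the listing above shows them as percentages.
    for caption in resp.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}")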