Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.405.D

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Arrowhead 96.8

Clarifai
created on 2019-07-07

no person 99.9
one 97.8
food 95.7
still life 95.5
grow 93.7
wood 92.9
paper 89
dairy product 88.2
cutout 87
triangle 86.6
winter 86.1
blur 85.5
simplicity 80.7
nature 80.1
texture 78.2
conceptual 77.2
cold 75.9
disjunct 74.9
two 74.9
snow 74.5

Imagga
created on 2019-07-07

shovel 49.8
tea 34.5
hand tool 31.8
tile 27.1
tool 25.9
paper 25.1
blank 18.9
note 18.4
yellow 17.9
food 16.3
close 16
brown 14
object 13.9
empty 13.7
old 13.2
piece 13
grunge 12.8
ray 12.8
page 12.1
snack 12
message 11.9
fire iron 11.4
stingray 11.2
board 11.2
texture 11.1
notice 10.7
sign 10.5
post 10.5
nobody 10.1
wood 10
slice 10
frame 10
pad 9.8
meal 9.7
office 9.6
horizontal 9.2
vintage 9.1
aged 9
dirty 9
breakfast 8.8
textured 8.8
closeup 8.8
cheese 8.8
symbol 8.8
envelope 8.7
notes 8.6
book 8.4
document 8.4
rough 8.2
detail 8
wooden 7.9
design 7.9
text 7.9
antique 7.8
sticky 7.8
reminder 7.8
container 7.8
space 7.8
announcement 7.7
stick 7.7
eat 7.6
bread 7.4
single 7.4
grain 7.4
open 7.2
information 7.1

Google
created on 2019-07-07

Artifact 55.7

Microsoft
created on 2019-07-07

piece 87.8
slice 74.6
half 57
eaten 22.5

Categories

Imagga

paintings art 98.9%

Captions

Microsoft
created on 2019-07-07

a knife on a white surface 51.7%
a half eaten apple 42.9%
a close up of a knife 42.8%