Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.541

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Food 71.6
Bread 71.6
Crystal 65.1
Soap 60
Ivory 57.8
Cuff 55.8

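Each Amazon tag above is a label paired with a confidence score. Purely as an illustration, the sketch below shows how comparable label/confidence pairs can be requested from Amazon Rekognition's DetectLabels API; the file name korean_sherd.jpg, the region, and the thresholds are assumptions for the example, not values taken from this record.

import boto3

# Assumed local copy of the sherd photograph; not part of this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("korean_sherd.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence percentages,
# the same shape as the Amazon tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
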
Clarifai
created on 2019-07-07

desktop 96.8
single 95.1
isolated 94.4
stranded 94
disjunct 93.6
design 93.4
no person 93
pattern 92
food 90.8
color 89.7
symbol 89.7
shape 89.2
individual 89.1
decoration 88.2
texture 87.3
little 86.6
business 86.4
traditional 86
one 85.8
old 84.6

Imagga
created on 2019-07-07

food 17.8
shell 17.4
close 17.1
brown 15.5
turtle 15.3
tortoise 13.8
relief 13
bag 13
reptile 12.8
stamp 12.7
slow 12.7
animal 12.4
paper 11.8
closeup 11.5
old 11.2
snack 11.1
object 11
sculpture 10.9
texture 10.4
die 10.4
money 10.2
nobody 10.1
container 9.8
fresh 9.2
protection 9.1
art 8.8
natural 8.7
nut 8.6
finance 8.4
nutrition 8.4
health 8.3
rough 8.2
decoration 8.2
symbol 8.1
currency 8.1
wildlife 8
tea 7.8
objects 7.8
shaping tool 7.8
scale 7.7
sign 7.5
dollar 7.4
gold 7.4
investment 7.3
cash 7.3
slice 7.3
wealth 7.2
celebration 7.2
bank 7.2
sweet 7.1
box 7.1
life 7
wood 7

Google
created on 2019-07-07

Sculpture 78.8
Beige 78.4
Artifact 60.5
Ceramic 58.5
Rock 54.2
Art 50.2

Color Analysis

Feature analysis

Amazon

Bread 71.6%

Categories

Imagga

paintings art 98.3%
food drinks 1.3%

Captions

Microsoft
created on 2019-07-07

a close up of a yellow wall 31.7%
a close up of a street 31.6%
a yellow and white text 30.2%
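
The Microsoft captions above pair a short natural-language description with a confidence score. As a hedged sketch only (the endpoint, subscription key, and image URL are placeholders, not taken from this record), caption lists of this shape can be produced with the Azure Computer Vision describe_image call:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute real Azure resource values.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Hypothetical public URL for the sherd image; not part of this record.
analysis = client.describe_image(
    "https://example.org/korean_sherd.jpg",
    max_candidates=3,
)

for caption in analysis.captions:
    # confidence is returned on a 0-1 scale; the page lists percentages
    print(f"{caption.text} {caption.confidence * 100:.1f}%")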