Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.379

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Fish 91.7
Animal 91.7
Shark 91.7
Sea Life 91.7
Jewelry 91.1
Accessory 91.1
Accessories 91.1
Gemstone 91.1
Pottery 70.4
Porcelain 70.4
Art 70.4
Sapphire 56.9

Clarifai
created on 2019-07-07

cutout 98.3
desktop 97.8
no person 96.7
one 96.6
color 95.7
food 93.6
still life 91.9
group 90.2
vintage 90.1
closeup 89.9
plate 88.8
healthy 88.6
triangle 88.4
grow 87.9
kitchenware 87.6
container 87.4
symbol 86.9
table 85.6
luxury 85.4
pattern 85

Imagga
created on 2019-07-07

handkerchief 100
piece of cloth 86.4
fabric 62.8
container 26
paper 25.1
blank 15.4
wallet 14.9
food 14.8
object 14.6
vintage 14
old 13.9
page 13.9
grunge 13.6
nobody 13.2
purse 12.9
gourmet 12.7
bag 11.4
ancient 11.2
texture 11.1
brown 11
business 10.9
cone 10.7
color 10.6
antique 10.4
close 10.3
pattern 9.6
empty 9.4
money 9.4
eat 9.2
note 9.2
delicious 9.1
aged 9
backgrounds 8.9
tea 8.9
cheese 8.8
case 8.6
yellow 8.6
snack 8.5
finance 8.4
health 8.3
retro 8.2
cuisine 8
text 7.9
fresh 7.8
black 7.8
diary 7.8
delicatessen 7.8
lunch 7.7
worn 7.6
plate 7.6
dinner 7.6
book 7.5
thimble 7.5
fat 7.5
document 7.4
design 7.3
meal 7.3
diet 7.3
meat 7.2
open 7.2
art 7.2

Google
created on 2019-07-07

Microsoft
created on 2019-07-07

turquoise 53.8
ceramic ware 39.6
plate 31.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 91.4%
Calm 24.2%
Surprised 5.8%
Angry 3.1%
Happy 53.6%
Disgusted 1.5%
Confused 5%
Sad 6.8%

Feature analysis

Amazon

Shark 91.7%

Categories

Imagga

paintings art 97.3%
pets animals 1.4%