Human Generated Data

Title

Korean Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of David K. and Rita S. Jordt, 1995.1169.392
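
The human-generated fields above come from the museum's collection records, which are also exposed through the Harvard Art Museums API. Below is a minimal sketch of retrieving such a record; the API key and object ID are placeholders, and the exact fields returned depend on the record.

import requests

# Fetch a collection record from the Harvard Art Museums API.
# HAM_API_KEY and OBJECT_ID are placeholders; keys are issued at
# harvardartmuseums.org/collections/api.
HAM_API_KEY = "YOUR_API_KEY"
OBJECT_ID = 12345  # hypothetical object ID for illustration

resp = requests.get(
    f"https://api.harvardartmuseums.org/object/{OBJECT_ID}",
    params={"apikey": HAM_API_KEY},
    timeout=30,
)
resp.raise_for_status()
record = resp.json()

# Fields such as title, classification, and creditline mirror the
# human-generated data shown above.
print(record.get("title"), record.get("classification"), record.get("creditline"))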

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Accessories 97.9%
Accessory 97.9%
Food 96.4%
Bread 96.4%
Jewelry 91.9%
Gemstone 91.3%
Bead 60.8%
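
Amazon's tags are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal boto3 sketch follows; the region and image file name are placeholders.

import boto3

# Label detection with AWS Rekognition; region and file name are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("sherd.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=10)

# Each label carries a name and a confidence score (0-100),
# matching the "Accessories 97.9%" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}%')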

Clarifai
created on 2019-07-07

gem 98.5%
jewelry 98.5%
ceramic 97.3%
pottery 97.2%
desktop 97.1%
crystal 96.5%
no person 96.4%
container 96.3%
quartz 95.3%
color 95.3%
precious 94.7%
bathroom 94.4%
stone 94%
closeup 91.5%
shape 91%
luxury 90.7%
food 90.4%
soap 90.4%
hygiene 89.6%
cutout 88.9%
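
Clarifai's concepts come from its general image-recognition model. One way to reproduce a list like this is the v2 REST predict endpoint; the model ID, API key, and image URL below are assumptions for illustration, and the authorization scheme follows the older key-based v2 API, which may have changed since 2019.

import requests

# Clarifai v2 predict via REST; key, image URL, and the public general
# model's ID ("general-image-recognition") are assumptions here.
CLARIFAI_KEY = "YOUR_API_KEY"
IMAGE_URL = "https://example.org/sherd.jpg"

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concept values are 0-1 floats; the list above shows them as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}%')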

Imagga
created on 2019-07-07

saltshaker 100%
container 83.4%
shaker 83.3%
whiskey jug 30.3%
jug 28.6%
bottle 25.1%
close 20%
vessel 15%
closeup 12.8%
water 12.7%
piggy bank 12.3%
food 12.3%
liquid 11.9%
cap 11.8%
object 11.7%
clean 11.7%
transparent 11.6%
wet 10.7%
hygiene 10.4%
cold 10.3%
thimble 10.3%
purity 10.2%
color 10%
yellow 9.9%
health 9.7%
ice 9.7%
drop 9.1%
shape 8.9%
soap 8.7%
clear 8.7%
finance 8.4%
freshness 8.3%
refreshment 8.2%
square 8.1%
wealth 8.1%
cool 8%
glass 8%
shiny 7.9%
fresh 7.8%
dairy 7.8%
cube 7.7%
snack 7.7%
money 7.7%
frozen 7.6%
horizontal 7.5%
savings 7.5%
care 7.4%
banking 7.4%
bank 7.2%
wooden 7%
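
Imagga's tags map to its /v2/tags endpoint, which takes HTTP basic auth with an API key and secret. A minimal sketch, with placeholder credentials and image URL:

import requests

# Imagga tagging endpoint; key, secret, and image URL are placeholders.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/sherd.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Tags arrive with a 0-100 confidence and a name keyed by language.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}%')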

Google
created on 2019-07-07

Porcelain 89.5%
Turquoise 80.6%
Turquoise 63.5%
Ceramic 62.6%
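
Google's labels correspond to the Cloud Vision API's label detection feature. A minimal sketch with the google-cloud-vision client library; the image URL is a placeholder, credentials are assumed to be configured via GOOGLE_APPLICATION_CREDENTIALS, and the class names follow the current client rather than the 2019-era one.

from google.cloud import vision

# Label detection with the Cloud Vision client library.
# The image URL is a placeholder.
client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.org/sherd.jpg")
)

response = client.label_detection(image=image)

# Scores are 0-1 floats; the list above shows them as percentages.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}%")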

Microsoft
created on 2019-07-07

cup 91.2%
doughnut 89.7%
pastry 69.9%
cheese 56.6%
half 51.3%
eaten 46.3%
ceramic ware 25.3%
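
Microsoft's tags are the sort returned by the Azure Computer Vision analyze endpoint with the Tags visual feature. A minimal REST sketch; the resource endpoint, subscription key, and image URL are placeholders, and the v3.2 API version shown here postdates the 2019 run.

import requests

# Azure Computer Vision analyze call; endpoint, key, and URL are placeholders.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/sherd.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Tag confidences are 0-1 floats; the list above shows them as percentages.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')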

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 55.7%
Calm 44.6%
Sad 17.5%
Angry 12.2%
Happy 4.3%
Disgusted 13.6%
Confused 3.1%
Surprised 4.8%
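
The age range, gender, and emotion scores above are typical of AWS Rekognition's DetectFaces operation with all attributes requested (the model occasionally reports a face even in a non-portrait image like this sherd). A minimal boto3 sketch; the region and file name are placeholders.

import boto3

# Face analysis with AWS Rekognition; region and file name are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("sherd.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# Attributes=["ALL"] adds AgeRange, Gender, and per-emotion confidences.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')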

Feature analysis

Amazon

Bread 96.4%

Categories

Imagga

paintings art 81.9%
food drinks 17.6%

Captions

Microsoft
created on 2019-07-07

a half eaten donut 40.5%
a half eaten doughnut 38.7%
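
The caption candidates come from the Computer Vision describe endpoint, which returns ranked captions with confidences. A minimal sketch reusing the placeholder endpoint and key from the tags example above:

import requests

# Azure Computer Vision caption generation; endpoint, key, and URL are placeholders.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/sherd.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Candidates mirror the "a half eaten donut 40.5%" entries above.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')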