Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. George C. Scanlon, 1970.157.51

Machine Generated Data

Tags

Amazon
created on 2019-07-08

Ornament 91.8
Accessories 91.1
Accessory 91.1
Jewelry 90.3
Gemstone 86.5
Pottery 74.5
Diamond 66.5
Agate 59.9
Figurine 58.7
Rock 56.9

Clarifai
created on 2019-07-08

desktop 97.9
souvenir 97.6
food 96.3
decoration 95.9
pattern 95.2
shape 94.7
disjunct 94.4
stone 94.1
art 94.1
pottery 93.3
symbol 93.1
one 92.8
no person 92.7
jewelry 92.3
ceramic 92.1
old 90.9
precious 90.8
grow 90.2
clay 90
texture 89.5

Imagga
created on 2019-07-08

earthenware 54.5
ceramic ware 41.5
utensil 27.1
majolica 24.6
container 22.3
close 16.5
brown 14
gold 14
decoration 12.7
money 11.9
bag 11.5
golden 11.2
animal 11
closeup 10.8
vase 10.4
object 10.3
symbol 10.1
currency 9.9
jewel 9.7
color 9.4
ring 9.4
luxury 9.4
finance 9.3
bank 9
food 8.7
gift 8.6
chocolate 8.6
purse 8.1
wealth 8.1
turtle 8.1
natural 8
gem 7.8
frog 7.7
design 7.6
tasty 7.5
candy 7.5
savings 7.5
banking 7.3
jar 7.2
holiday 7.2
transparent 7.2

Google
created on 2019-07-08

Rock 81.8
Geology 68
Ceramic 65.2
Mineral 60.7

Microsoft
created on 2019-07-08

skull 86.2
fossil 67.4
bone 52.5
ceramic ware 27.9

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 6-13
Gender Male, 89.5%
Happy 1%
Disgusted 2.5%
Confused 7.2%
Sad 28.9%
Calm 35.2%
Surprised 7.3%
Angry 17.9%
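The Rekognition emotion scores above sum to 100%, so the dominant emotion is simply the highest-scoring entry. A small sketch, assuming the values have been parsed into a dict (the dict literal below just transcribes the listing above):

```python
# Emotion confidences from the AWS Rekognition block above
emotions = {
    "Happy": 1.0, "Disgusted": 2.5, "Confused": 7.2, "Sad": 28.9,
    "Calm": 35.2, "Surprised": 7.3, "Angry": 17.9,
}

def dominant_emotion(scores: dict[str, float]) -> str:
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # → Calm
```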

Categories

Imagga

paintings art 95.8%
food drinks 4.1%

Captions

Microsoft
created on 2019-07-08

a piece of food 55.5%
a piece of bread 55.4%
a close up of food 55.3%