Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. George C. Scanlon, 1970.157.121

Machine Generated Data

Tags

Amazon
created on 2019-07-08

Fossil 97.8
Soil 88.5
Pottery 77.3
Text 63.6
Archaeology 62.5
Plant 58.9
Rock 55.2

Clarifai
created on 2019-07-08

no person 98.1
art 97.9
one 97.6
jewelry 96.6
decoration 96.5
pottery 95.9
pattern 95.8
desktop 95.7
food 95.6
container 94.7
shape 93.4
painting 93.4
disjunct 93.2
cutout 92.7
clay 92
color 91.8
illustration 91.1
design 91.1
ceramic 91.1
still life 91

Imagga
created on 2019-07-08

bag 50.5
container 47.9
box 29.2
gift 24.1
purse 22.9
stamp 22.1
gold 20.5
die 19
decoration 18.7
present 17.3
celebration 16.7
love 15
object 14.7
holiday 14.3
jewelry 13.8
golden 13.8
shaping tool 13.5
brown 13.2
valentine 12.7
shiny 12.7
gem 12.1
luxury 12
close 12
jewel 11.6
mailbag 11.5
closeup 11.5
bow 11.4
ribbon 11.1
ring 10.5
surprise 10.4
shape 10.3
food 9.8
texture 9.7
paper 9.7
package 9.7
piggy bank 9.6
ornament 9.5
anniversary 9.4
money 9.4
birthday 9.3
romance 8.9
symbol 8.8
precious 8.7
fastener 8.7
occasion 8.7
earthenware 8.6
snack 8.5
color 8.3
tool 8.2
fabric 8.2
celebrate 8.1
wealth 8.1
objects 7.8
diamond 7.7
old 7.7
chest 7.6
shopping 7.3
yellow 7.3
open 7.2
design 7.1

Google
created on 2019-07-08

Brown 81.2
Rock 73.4
Beige 72.4
Font 68.6
Geology 68
Illustration 50.4
Art 50.2

Microsoft
created on 2019-07-08

vase 71.2

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 93.3%
Calm 61.9%
Surprised 1.8%
Disgusted 1.4%
Sad 8.6%
Angry 8%
Confused 5.5%
Happy 12.9%

Categories

Imagga

food drinks 93.8%
paintings art 5.9%

Captions