Human Generated Data

Title: Sherd
Date: -
People: -
Classification: Fragments
Credit Line: Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. George C. Scanlon, 1970.157.193

Machine Generated Data

Tags

Each service below lists the labels it assigned to the image together with its confidence score (%).

Amazon
created on 2019-07-08

Accessories 98.4
Accessory 98.4
Jewelry 95
Gemstone 94.6
Turquoise 87.7
Ornament 83.6
Jade 83.6
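
The Amazon tags above are the kind of label/confidence output returned by AWS Rekognition's label detection. A minimal sketch of such a call, assuming boto3 with configured AWS credentials; the file name "sherd.jpg" and the region are placeholders, not taken from this record.

    import boto3

    # Placeholder region and file name; assumes AWS credentials are configured.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("sherd.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=80,
        )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")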

Clarifai
created on 2019-07-08

desktop 96.2
art 94.1
texture 92.6
disjunct 91.1
no person 90.9
nature 89.1
color 88.5
one 85.7
bright 85
stone 83.7
clean 81.6
traditional 80
closeup 79.5
pattern 79.4
design 79.1
food 79
sea 78.9
funny 78.2
single 78.2
shape 76.1
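
The Clarifai tags follow the same label/confidence pattern. A rough sketch of a request to Clarifai's v2 predict endpoint; the API key, model ID, and image URL below are placeholders, and the exact model used for this record is not stated.

    import requests

    # Placeholder credentials, model ID, and image URL.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "GENERAL_MODEL_ID"
    IMAGE_URL = "https://example.org/sherd.jpg"

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts come back with a name and a 0-1 confidence value.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))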

Imagga
created on 2019-07-08

fabric 25.5
texture 20.1
color 15
lace 15
container 13.9
bag 13.6
pink 13.4
object 13.2
close 13.1
relief 12.9
paper 12.5
purse 11.8
fresh 11.8
decoration 11.6
colorful 11.5
nobody 10.9
food 10.9
cap 10.8
soft 10.8
velvet 10.5
piece of cloth 10.5
love 10.3
clean 10
valentine 10
clothing 10
single 9.9
watercolor 9.7
closeup 9.4
fat 9.3
art 9.3
design 9
hat 9
symbol 8.8
bathing cap 8.6
towel 8.5
money 8.5
freshness 8.3
stack 8.3
one 8.2
pattern 8.2
shape 8.2
meal 8.1
backgrounds 8.1
ingredient 7.9
objects 7.8
wool 7.8
orange 7.7
old 7.7
grunge 7.7
gourmet 7.6
headdress 7.4
cotton 7.4
piece 7.4
dish 7.2
meat 7.2
celebration 7.2
bright 7.1
cuisine 7.1
breakfast 7.1
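
The Imagga tags are likewise label/confidence pairs. A sketch using Imagga's v2 tagging endpoint with HTTP basic auth; the key, secret, and image URL are placeholders.

    import requests

    # Placeholder credentials and image URL.
    API_KEY = "YOUR_IMAGGA_KEY"
    API_SECRET = "YOUR_IMAGGA_SECRET"
    IMAGE_URL = "https://example.org/sherd.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each tag has an English label and a 0-100 confidence.
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))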

Google
created on 2019-07-08
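
The Google block above records no tags for this image. For reference, label detection with the Google Cloud Vision client would look roughly like the sketch below; the image URI is a placeholder and credentials are assumed to be configured.

    from google.cloud import vision

    # Placeholder image URI; assumes Google Cloud credentials are configured.
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/sherd.jpg"))

    response = client.label_detection(image=image)

    # Labels carry a description and a 0-1 score.
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))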

Microsoft
created on 2019-07-08

ceramic 74.1
accessory 54
turquoise 53.8
enamel 18.2
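
The Microsoft tags resemble the output of the Azure Computer Vision "tag" operation. A sketch against the v2.0 REST endpoint; the endpoint host, subscription key, and image URL are placeholders.

    import requests

    # Placeholder endpoint, key, and image URL.
    ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
    SUBSCRIPTION_KEY = "YOUR_AZURE_KEY"
    IMAGE_URL = "https://example.org/sherd.jpg"

    response = requests.post(
        f"{ENDPOINT}/vision/v2.0/tag",
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        json={"url": IMAGE_URL},
    )
    response.raise_for_status()

    # Tags are returned with a name and a 0-1 confidence.
    for tag in response.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))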

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 94%
Confused 0.6%
Disgusted 0.3%
Angry 0.6%
Sad 94.9%
Happy 1.5%
Calm 1.6%
Surprised 0.4%
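
The age range, gender, and emotion scores above are the fields AWS Rekognition reports for each detected face. A minimal sketch of such a call, with the same placeholder file name and region as before.

    import boto3

    # Placeholder region and file name; assumes AWS credentials are configured.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("sherd.jpg", "rb") as f:
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        # Emotions are listed individually with their own confidence scores.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")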

Categories

Captions