Human Generated Data

Title

Sherd

Date

-

People

-

Classification

Fragments

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. George C. Scanlon, 1970.157.110

Machine Generated Data

Tags

Amazon
created on 2019-07-08

Clarifai
created on 2019-07-08

no person 97.9
art 97
still life 94.1
food 93.2
one 92.5
cutout 90.9
retro 88.6
grow 87.3
texture 87.2
desktop 86.5
old 86
container 84.4
wood 81.1
nature 79.2
stone 79.2
paper 79.1
color 78.9
sea 78.3
simplicity 77.2
traditional 77

Imagga
created on 2019-07-08

earthenware 67.5
ceramic ware 50.8
majolica 39.4
utensil 33.4
container 33.1
food 23.8
pot 21
close 19.4
breakfast 18.6
bread 18.5
brown 18.4
fresh 18.3
meal 17.8
snack 17.1
healthy 16.4
dinner 15.1
lunch 14.6
vessel 14.1
diet 13.7
closeup 13.5
drink 13.4
slice 12.7
eat 12.6
object 12.5
nobody 12.4
delicious 12.4
nutrition 11.7
color 11.7
natural 11.4
tasty 10.9
tea 10.4
cup 10.1
beverage 10
vase 9.7
life 9.4
gourmet 9.3
hot 9.2
wood 9.2
paper 8.6
toast 8.6
plate 8.5
ice 8.4
eating 8.4
organic 8.4
health 8.3
pattern 8.2
dessert 8.1
morning 8.1
water 8
wooden 7.9
loaf 7.8
saucer 7.8
ceramic 7.7
flower 7.7
piece 7.6
bakery 7.6
wheat 7.6
plant 7.5
shape 7.5
single 7.4
full 7.3
yellow 7.3
cut 7.2
holiday 7.2
kitchen 7.2
pottery 7.1

Google
created on 2019-07-08

Turquoise 66.2

Microsoft
created on 2019-07-08

piece 78.1
turquoise 69.5
art 68.3
half 67
vase 62.9

Color Analysis

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2019-07-08

a piece of cake 38.5%
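As an illustration, the machine-generated tag/score pairs above can be handled programmatically as (name, confidence) tuples and filtered by a confidence threshold. This is a minimal sketch, not part of any service's API: the data is a small subset of the Clarifai output listed above, and the 90.0 threshold is an arbitrary assumption.

```python
# Minimal sketch: filter machine-generated tags by confidence score.
# The tag/score pairs are a subset of the Clarifai tags listed above;
# the 90.0 cutoff is an assumed, arbitrary threshold.
clarifai_tags = [
    ("no person", 97.9),
    ("art", 97.0),
    ("still life", 94.1),
    ("food", 93.2),
    ("one", 92.5),
    ("cutout", 90.9),
    ("retro", 88.6),
]

def confident_tags(tags, threshold=90.0):
    """Return the names of tags whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident_tags(clarifai_tags))
# ['no person', 'art', 'still life', 'food', 'one', 'cutout']
```

Raising the threshold simply shortens the list; for example, `confident_tags(clarifai_tags, threshold=95.0)` keeps only "no person" and "art".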