Human Generated Data

Title

Alms Bowl with Inscription

Date

mid 20th century

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Holmes H. Welch, 1981.78.5.1

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Bowl 100
Pottery 97.5
Soup Bowl 89.9
Art 83.6
Porcelain 83.6
Jar 66.4
Vase 66.4

Clarifai
created on 2019-07-06

no person 97.6
desktop 95.2
cutout 94.9
one 94.7
tableware 94.6
container 94.4
still life 94.3
food 92.9
color 92.7
empty 89.7
pottery 89.7
kitchenware 87.5
studio 86.5
shape 85.6
single 82.7
simplicity 82.4
isolated 82.2
closeup 82
two 81.6
decoration 81.2

Imagga
created on 2019-07-06

earthenware 68
bangle 56.6
bowl 52.9
ceramic ware 49.1
utensil 39.9
container 29.3
food 28
vessel 24.5
object 18.3
cup 17.7
healthy 17
dish 16.4
close 16
brown 15.5
diet 15.4
kitchen 15.2
meal 14.6
color 14.5
natural 14.1
nutrition 13.4
breakfast 13.3
ingredient 13.2
cooking 13.1
decoration 13
life 12.5
gold 12.3
tea 12.2
gourmet 11.9
dinner 11.8
fresh 11.8
tasty 11.7
nobody 11.7
black 11.4
egg 11.4
shiny 11.1
drink 10.9
pot 10.8
delicious 10.7
ceramic 10.7
yellow 10.6
empty 10.3
cream 10.3
sweet 10.3
snack 10.3
plate 10.2
eating 10.1
organic 10.1
eat 10.1
fruit 9.8
cuisine 9.8
round 9.5
restaurant 9.5
closeup 9.4
lunch 9.4
cook 9.1
traditional 9.1
vegetarian 8.9
silver 8.8
vegetable 8.7
circle 8.6
luxury 8.6
culture 8.5
rich 8.4
hot 8.4
pottery 8
dessert 7.9
objects 7.8
glass 7.8
shell 7.7
health 7.6
wood 7.5
fat 7.5
single 7.4
slice 7.3
beverage 7.3

Google
created on 2019-07-06

Orange 87.2
earthenware 85.9
Ceramic 82.6
Vase 74.8
Pottery 70.6
Flowerpot 56.7
Artifact 55.7

Microsoft
created on 2019-07-06

sitting 95.2
plant 84
ceramic ware 72.9
dishware 51
tableware 44
bowl 26.9
porcelain 9.3

Color Analysis

Categories

Imagga

food drinks 73.9%
interior objects 26.1%

Text analysis

Amazon

F
'L