Human Generated Data

Title

Bowl

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Nanette B. Rodney, 1978.70
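
The human-generated fields above (title, classification, credit line) are the kind of record the Harvard Art Museums public API serves for accession number 1978.70. Below is a minimal sketch of fetching that record; the environment variable name, the Lucene-style objectnumber query, and the exact response field names are assumptions, not confirmed details of this page.

```python
# Hypothetical sketch: fetch the human-generated record for accession
# number 1978.70 from the Harvard Art Museums public API.
# Assumptions: an API key in HAM_API_KEY and a Lucene-style query on
# the objectnumber field; response field names are assumed as well.
import os
import requests

API_KEY = os.environ["HAM_API_KEY"]  # free key from harvardartmuseums.org
URL = "https://api.harvardartmuseums.org/object"

resp = requests.get(URL, params={"apikey": API_KEY, "q": "objectnumber:1978.70"})
resp.raise_for_status()

for record in resp.json().get("records", []):
    # Print the same fields shown above: title, classification, credit line.
    print(record.get("title"), "|", record.get("classification"), "|", record.get("creditline"))
```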

Machine Generated Data

Tags (each tag is listed with the service's confidence score, in percent)

Amazon
created on 2022-06-10

Bowl 100
Pottery 79.4
Soup Bowl 78.3
Porcelain 77.7
Art 77.7
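
The Amazon list above pairs a label with a confidence percentage, which is the shape of Amazon Rekognition's label detection output. A minimal sketch with boto3 follows; the local filename "bowl.jpg" and the MaxLabels/MinConfidence settings are assumptions.

```python
# Minimal sketch of label detection with Amazon Rekognition via boto3.
# Assumptions: AWS credentials/region are configured, and "bowl.jpg" is
# a local photograph of the object (hypothetical filename).
import boto3

client = boto3.client("rekognition")

with open("bowl.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},  # or {"S3Object": {...}} for images in S3
        MaxLabels=10,
        MinConfidence=50,
    )

# Each label carries a name and a confidence score in percent,
# matching the "Bowl 100 / Pottery 79.4 / ..." list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```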

Clarifai
created on 2023-10-29

monochrome 99.7
art 99.3
no person 98.8
bowl 97.7
pottery 97.5
food 96.4
black and white 95.2
empty 95.2
plate 95.1
still life 94.7
clay 93.1
vintage 92.4
one 92.2
container 91.9
shadow 90.8
dish 90.2
minimalism 89.7
vase 89
texture 88.9
ceramic 88.8
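
The Clarifai concepts above (name plus confidence) resemble output from Clarifai's general image recognition model. The sketch below is a rough illustration against the v2 REST API; the model identifier, the request payload shape, and the image URL are assumptions.

```python
# Rough sketch of tagging an image with Clarifai's v2 REST API.
# Assumptions: a personal access token in CLARIFAI_PAT, a public image
# URL, and "general-image-recognition" as the model identifier.
import os
import requests

PAT = os.environ["CLARIFAI_PAT"]
MODEL_ID = "general-image-recognition"      # assumed model id
IMAGE_URL = "https://example.org/bowl.jpg"  # placeholder image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a name and a value in [0, 1]; scaled by 100
# they compare with the percentages listed above (e.g. "bowl 97.7").
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```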

Imagga
created on 2022-06-10

bowl 100
container 80.7
vessel 74.5
china 35.5
porcelain 28.5
mixing bowl 27.1
cup 26.1
utensil 24
ceramic ware 23.9
soup bowl 20.5
glass 19.4
dish 19.4
drink 17.5
kitchen 15.2
food 15.1
close 14.8
object 13.9
tableware 13.7
beverage 12.5
water 12
hot 11.7
cooking 11.3
pot 11.3
metal 11.3
nobody 10.9
color 10.6
old 10.4
tea 10.3
health 9.7
empty 9.4
healthy 9.4
cook 9.1
black 9
reflection 8.9
breakfast 8.8
alcohol 8.8
ingredient 8.8
earthenware 8.8
closeup 8.8
liquid 8.7
natural 8.7
fresh 8.5
washbasin 8.5
stone 8.4
vase 8.2
transparent 8.1
shiny 7.9
clear 7.8
art 7.8
basin 7.7
mortar 7.6
wood 7.5
meal 7.3
decoration 7.2
wooden 7
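
The Imagga tags follow the same tag-plus-confidence pattern. Here is a sketch assuming Imagga's v2 tagging endpoint with HTTP Basic authentication; the endpoint path, response layout, and image URL are assumptions.

```python
# Sketch of image tagging with the Imagga v2 API (endpoint and response
# layout are assumptions). Credentials come from environment variables.
import os
import requests

auth = (os.environ["IMAGGA_API_KEY"], os.environ["IMAGGA_API_SECRET"])
IMAGE_URL = "https://example.org/bowl.jpg"  # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=auth,
)
resp.raise_for_status()

# Each entry pairs a tag (per language) with a confidence percentage,
# comparable to the "bowl 100 / container 80.7 / ..." list above.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```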

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

wall 96.6
indoor 95.3
vase 90.2
earthenware 83.5
black 70
pottery 69.2
ceramic 67.9
mixing bowl 67.4
tableware 66.9
ceramic ware 64.1
dishware 46.3
bowl 45.7
stoneware 27
porcelain 12.9
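
The Microsoft tags above, and the captions further below, look like output from Azure's Computer Vision service. A sketch of the tagging call via the azure-cognitiveservices-vision-computervision SDK follows; the endpoint/key environment variable names and the image URL are placeholders.

```python
# Sketch of image tagging with Azure Computer Vision
# (azure-cognitiveservices-vision-computervision SDK). Endpoint/key
# environment variable names and the image URL are placeholders.
import os
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    os.environ["AZURE_CV_ENDPOINT"],
    CognitiveServicesCredentials(os.environ["AZURE_CV_KEY"]),
)

IMAGE_URL = "https://example.org/bowl.jpg"  # placeholder image URL

# tag_image returns tags with confidences in [0, 1]; scaled to percent
# they line up with the "wall 96.6 / indoor 95.3 / ..." list above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```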

Color Analysis

Categories

Captions

Microsoft
created on 2022-06-10

a close up of a bowl 83.4%
close up of a bowl 77.6%
a close up of a large bowl 73.9%
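
The captions above, each with a confidence percentage, match the shape of the same service's describe operation. A self-contained sketch follows, using the same placeholder endpoint/key variables as the tagging sketch; the max_candidates value and image URL are assumptions.

```python
# Sketch of caption generation with Azure Computer Vision's describe
# operation; placeholder endpoint/key variables and image URL.
import os
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    os.environ["AZURE_CV_ENDPOINT"],
    CognitiveServicesCredentials(os.environ["AZURE_CV_KEY"]),
)

IMAGE_URL = "https://example.org/bowl.jpg"  # placeholder image URL

# describe_image returns candidate captions with confidences in [0, 1],
# comparable to "a close up of a bowl 83.4%" above.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```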