Human Generated Data

Title

Dish

Date

-

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Stuart C. Welch, 1966.129

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-06-10

Tabletop 95.1
Furniture 95.1
Bowl 85.2
Pottery 80.3
Cushion 74.5
Jar 63.0
Porcelain 56.6
Art 56.6
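
Labels like these typically come from Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 (the local file name "dish.jpg" and the MaxLabels value are illustrative assumptions, not details from this page):

```python
import boto3

# Assumes AWS credentials and a default region are already configured.
rekognition = boto3.client("rekognition")

with open("dish.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
)

# Rekognition reports confidence on a 0-100 scale, matching the scores above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```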

Clarifai
created on 2023-10-29

monochrome 99.4
one 99.3
no person 98.4
still life 98.3
people 97.8
art 96.2
education 95.6
vehicle 93.7
adult 93.5
school 93.1
sculpture 88.5
empty 87.3
transportation system 85.1
aircraft 85.0
bird 84.7
food 84.4
airplane 83.7
car 82.6
two 80.8
vintage 80.7
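
Clarifai scores like these are produced by its general image-recognition model, reachable through the v2 predict endpoint. A rough sketch with requests (the API key, model ID, and image URL below are placeholders, not values from this page):

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder credential
MODEL_ID = "general-image-recognition"       # assumed ID for Clarifai's general model
IMAGE_URL = "https://example.com/dish.jpg"   # hypothetical image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Clarifai returns concept values in 0-1; scale by 100 to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```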

Imagga
created on 2022-06-10

container 30.5
knife blade 29.8
knife 26.7
blade 26.3
cup 25.7
cutting implement 19.8
dagger 19.7
tool 17.4
silverware 14.6
vessel 14.5
black 14.4
close 14.3
weapon 12.9
currency 12.6
ladle 12.3
paper 12.2
money 11.9
tray 11.7
closeup 11.4
object 11.0
silver 10.6
empty 10.5
metal 10.5
old 10.4
cash 10.1
water 10.0
glass 9.9
tea 9.8
business 9.7
table 9.5
drink 9.2
saucer 8.7
equipment 8.5
utensil 8.5
finance 8.4
tableware 8.4
texture 8.3
fastener 8.2
device 8.2
bank 8.1
steel 8.0
receptacle 7.9
design 7.9
text 7.8
plate 7.6
clean 7.5
wood 7.5
dollar 7.4
food 7.2
decoration 7.2
shiny 7.1
breakfast 7.1
spoon 7.0
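
Imagga exposes comparable tagging through its /v2/tags endpoint with HTTP Basic auth. A minimal sketch (credentials and image URL are placeholders):

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"              # placeholder credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"        # placeholder credential
IMAGE_URL = "https://example.com/dish.jpg"   # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
)
resp.raise_for_status()

# Imagga already reports confidence on a 0-100 scale.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```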

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

tableware 91.8
black and white 89.2
text 82.4
dishware 75.5
white 68.4
cup 41.1
bowl 17.9
porcelain 5.0
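
Microsoft's tags, and the captions further below, both come from the Azure Computer Vision Analyze API. A hedged sketch against the v3.2 REST endpoint (the endpoint, key, and image URL are placeholders):

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_CV_KEY"                                       # placeholder credential
IMAGE_URL = "https://example.com/dish.jpg"                      # hypothetical image URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()
analysis = resp.json()

# Tag confidences arrive in 0-1; scale to % to match the list above.
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```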

Captions

Microsoft
created on 2022-06-10

a bowl of water 45%
a close up of a bowl of water 42.6%
a close up of a bowl 42.5%
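
These captions are carried in the "description" field of the same Analyze response sketched above, so no extra request is needed:

```python
# Continuing from the Azure sketch above: extract the generated captions.
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```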