Human Generated Data

Title

Arretine Ware Bowl

Date

100 BCE-100 CE

People

-

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.3113

Machine Generated Data

Tags (label and confidence score)

Amazon
created on 2022-06-10

Bowl 96
Pottery 88.8
Porcelain 77.7
Art 77.7
Soup Bowl 72.8
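
Each Amazon entry above pairs a label with a confidence score (e.g. Bowl 96, roughly 96% confidence). This is the shape of output Amazon Rekognition's DetectLabels operation returns; a minimal sketch with boto3 follows, where the S3 bucket, image key, and region are hypothetical placeholders rather than values from this record.

# Minimal sketch, assuming the catalogue photograph sits in an S3 bucket.
# Bucket name, key, and region are placeholders, not values from this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "museum-images", "Name": "1977.216.3113.jpg"}},
    MaxLabels=10,
    MinConfidence=70,
)

# Print label/confidence pairs in the same form as the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')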

Clarifai
created on 2023-10-29

no person 99.2
empty 97.5
simplicity 96.9
chrome 95
glazed 94.6
monochrome 94.2
round out 93.7
still life 92.3
isolated 92.1
minimalism 91.1
steel 87.5
aluminum 87.5
reflection 86.9
Zen 86
art 84.9
ceramic 84.8
desktop 84.3
clean 83.9
round 82.8
balance 82.3

Imagga
created on 2022-06-10

lens cap 82.2
cap 64.4
pan 55.3
protective covering 49.3
frying pan 41.9
cooking utensil 35.3
covering 32.6
kitchen utensil 23.5
black 20.4
utensil 20.3
spa 19.7
kitchen 19.7
wok 19.6
circle 19.4
health 16.7
close 16.6
disk 15.9
hot 15.1
equipment 14.5
stone 14.3
stove 14
cooking 13.1
food 12.7
object 12.5
rock 12.2
oven 12.1
music 11.7
stack 11.1
record 10.7
reflection 10.6
metal 10.5
balance 10.4
therapy 10.4
harmony 10.3
sound 10.3
dutch oven 10.3
heat 10.2
plate 10.2
kitchenware 10.1
relaxation 10.1
water 10
disc 9.7
liquid 9.6
drink 9.2
treatment 9.2
puck 9.1
burner 9
japan 9
technology 8.9
vinyl 8.9
objects 8.7
shiny 8.7
empty 8.6
appliance 8.4
purity 8.3
cook 8.2
drop 8.2
closeup 8.1
pebble 8.1
gas 7.7
musical 7.7
stones 7.6
healthy 7.6
clean 7.5
entertainment 7.4
natural 7.4
meal 7.3
detail 7.2
concepts 7.1
bowl 7.1
steel 7.1
medicine 7
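
The Imagga tags follow the same tag-plus-confidence pattern (here reading the bowl largely as dark kitchenware). A sketch of the kind of REST call that yields such tags, assuming Imagga's v2 tagging endpoint with HTTP basic authentication and a hypothetical API key, secret, and image URL:

# Sketch only: credentials, image URL, and response layout are assumptions.
import requests

API_KEY = "your_imagga_key"        # placeholder
API_SECRET = "your_imagga_secret"  # placeholder
IMAGE_URL = "https://example.org/1977.216.3113.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score,
# matching lines such as "lens cap 82.2" above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')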

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

sky 99.2
text 75
tableware 73.4
white 66.7
design 66.4
silver 65.4
black and white 61.4

Color Analysis

Categories

Imagga

interior objects 100%

Captions

Text analysis

Amazon

MADO
EVERY
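
The two strings above (MADO, EVERY) are text the Amazon service read in the photograph of the bowl. Such results are typically obtained with Rekognition's DetectText operation; a brief sketch, again with a placeholder S3 location, follows.

# Sketch only: bucket, key, and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "museum-images", "Name": "1977.216.3113.jpg"}}
)

# Keep line-level detections, which correspond to entries such as "MADO".
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))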