Human Generated Data

Title

Arretine Bowl, reproduction

Date

People

Classification

Vessels

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Museum of Fine Arts, through Lacey D. Caskey, Esq., 1933.38.97

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Plant 99.8
Produce 97.5
Food 97.5
Vegetable 97.3
Bread 93.1
Gourd 90.8
Pumpkin 85
Turnip 80.9
Sphere 60.1

Imagga
created on 2022-01-29

pumpkin 77.3
food 43.2
vegetable 40.3
orange 35.6
earthenware 33.9
fruit 31.8
squash 31.4
autumn 26.4
ceramic ware 25.6
close 24
produce 21.2
fall 20.8
harvest 20.7
object 19.8
healthy 19.5
citrus 19.5
seasonal 19.3
organic 17.6
ripe 17.2
bread 17
fresh 17
nutrition 16.8
utensil 16.7
sweet 16.6
snack 16.2
yellow 15.9
single 15.6
closeup 14.8
tangerine 14.8
brown 14.7
color 14.5
eating 14.3
thanksgiving 13.7
ingredient 13.2
mandarin 13
holiday 12.9
plant 12.8
raw 12.5
breakfast 12.4
baked 12.2
vitamin 12.1
round 12.1
circle 12
juicy 11.8
gourd 11.8
nobody 11.7
loaf 11.7
juice 11.4
studio 11.4
lantern 11.1
gourmet 11
slice 10.9
nut 10.2
decoration 10.2
natural 10
peel 10
meal 9.7
crust 9.7
whole 9.6
light 9.4
season 9.4
horizontal 9.2
tasty 9.2
freshness 9.2
seed 9.1
cut 9
diet 8.9
bakery 8.6
walnut 8.4
eat 8.4
stem 8.4
drink 8.4
vegetarian 8.1
bright 7.9
cooking 7.9
traditional 7.5
sphere 7.4
full 7.3
agriculture 7

Google
created on 2022-01-29

Amber 85.6
Rangpur 85
Artifact 84.2
Art 82.1
Citrus 75.5
Circle 74.3
Peach 70.9
Carving 66.3
Ball 63.6
Rock 61.2
Sphere 60.9
Wood 57.8
Ceramic 57.4
Tangerine 57
Still life 56.1
earthenware 55.5
Still life photography 55.3
Creative arts 54.6

Microsoft
created on 2022-01-29

orange 53.3
painted 34.8

Feature analysis

Amazon

Bread 93.1%

Captions

Microsoft

a close up of an orange 36%

Text analysis

Amazon

73