Human Generated Data

Title

Statuette

Date

People

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from the Department of the Classics, Harvard University, Bequest of Henry W. Haynes, 1912, 1977.216.2398

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Bronze 93.7
Bread 82.7
Food 82.7
Archaeology 81.7
Tree 77.8
Plant 77.8
Rock 71.4
Figurine 58.2
Skin 57.7
Art 57.3
Fossil 55.6

Imagga
created on 2022-01-29

bread 35.3
food 33.2
meal 26
loaf 24.3
brown 23.5
baked 22.5
breakfast 21.2
wheat 19.5
bakery 19.1
healthy 18.3
ancient 18.2
crust 16.4
eating 16
tasty 15.9
fresh 15.7
grain 15.7
lunch 15.4
diet 15.3
old 15.3
delicious 14.9
close 14.8
slice 14.5
antique 14.1
cereal 13.8
rough 13.7
grunge 13.6
flour 13.5
eat 13.4
tree 12.8
snack 12.8
natural 12.7
dirty 12.6
nutrition 12.6
garment 12.1
paper 11.8
vintage 11.6
whole 11.5
retro 11.5
closeup 11.4
dinner 10.9
nutritious 10.4
texture 10.4
vegetable 10.3
organic 10.1
root vegetable 9.8
bake 9.6
sliced 9.6
stole 9.5
blank 9.4
gourmet 9.3
yellow 9.3
traditional 9.1
object 8.8
cooking 8.7
empty 8.6
weathered 8.5
stone 8.4
aged 8.1
history 8
clothing 8
textured 7.9
plant 7.9
baker 7.8
forest 7.8
rye 7.8
nobody 7.8
baking 7.7
past 7.7
scarf 7.5
wood 7.5
dry 7.4
page 7.4
meat 7.2
cuisine 7.1

Google
created on 2022-01-29

Plant 91.9
Tree 86.9
Wood 85.8
Dress 84.9
Art 83.1
Painting 79.7
Trunk 78
Pattern 74.8
Camouflage 73.5
Human leg 71.8
Artifact 70.6
Drawing 70
Illustration 69.6
Fashion design 69.5
Visual arts 67.2
Rock 59.8
Moth 57
Fossil 52.3
Artwork 52.1
Pattern 51.6

Feature analysis

Amazon

Bread 82.7%

Captions

Microsoft

a close up of a person 55%
close up of a person 47.4%
a person posing for a photo 37.7%