Human Generated Data

Title

Bull

Date

-

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Miss Bettina J. Kahnweiler, 1935.35.6

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Animal 88.8%
Mammal 87.8%
Figurine 85%
Statue 81.3%
Art 81.3%
Sculpture 81.3%
Wildlife 76.7%
Pottery 67.1%
Archaeology 64.3%
Bronze 59%
Dinosaur 56.3%
Reptile 56.3%
Tadpole 55.6%
Amphibian 55.6%

Clarifai
created on 2023-10-26

no person 99.2%
prehistoric 98.2%
sculpture 98%
one 97.8%
art 97.4%
nature 97.4%
stone 97.3%
rock 97%
still life 96%
old 95.4%
side view 90.1%
ancient 89%
sky 88.8%
broken 87.8%
simplicity 87.8%
isolated 87.5%
two 87.4%
museum 86.7%
cutout 84.2%
animal 84.1%

Imagga
created on 2022-01-22

moth 26.5%
device 26.3%
tool 25.4%
wrench 21.8%
insect 21.1%
metal 20.9%
tooth 20.7%
steel 17.7%
invertebrate 17%
animal 16.8%
construction 15.4%
arthropod 14.8%
work 14.1%
iron 14%
object 13.9%
spanner 13.7%
support 13%
close 12.5%
repair 12.4%
corbel 12.4%
equipment 12%
old 11.8%
food 10.5%
closeup 10.1%
bracket 9.9%
detail 9.6%
tools 9.5%
industry 9.4%
single 9%
rusty 8.6%
chrome 8.5%
hand 8.3%
metallic 8.3%
industrial 8.2%
gray 8.1%
black 7.8%
mechanic 7.8%
texture 7.6%
engineering 7.6%
power 7.5%
wood 7.5%
human 7.5%
baked 7.5%
dirty 7.2%
cut 7.2%
sea 7.1%
sea hare 7.1%

Google
created on 2022-01-22

Color Analysis

Feature analysis

Amazon

Dinosaur 56.3%

Categories

Imagga

pets animals 93.2%
nature landscape 4.4%
food drinks 1.9%
