Human Generated Data

Title

Ram

Date

-

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Joseph C. Hoppin, 1925.30.117

Machine Generated Data

Tags

Amazon
created on 2019-07-07

Figurine 99.9
Bronze 70
Art 68
Sculpture 67.7
Statue 67.7
Rust 59.8

Clarifai
created on 2019-07-07

mammal 97.8
animal 95.5
no person 94.9
one 91.1
isolated 88.8
funny 88.1
desktop 87.9
cute 86.5
illustration 84.9
art 80.5
wildlife 79.9
toy 79.4
cat 78.7
prehistoric 76.2
head 74
portrait 73.6
lion 73.4
nature 73
dog 73
pet 73

Imagga
created on 2019-07-07

animal 50.9
cat 47.9
baby 43.9
studio 39.5
predator 35.3
mammal 32.9
fur 32.8
cute 32.3
feline 28.8
ear 27.5
pet 22.9
lion 22.6
wildlife 22.3
animals 21.3
domestic 20.8
furry 20.2
beef 20
eye 19.6
carnivore 19.5
wild 19.1
kitten 18.4
looking 18.4
whisker 17.7
cub 16.8
piggy 15.4
pets 15.3
cats 14.7
staring 14.6
breed 14.5
bank 14.4
face 14.2
standing 13.9
adorable 12.9
lion cub 12.9
pig 12.5
brown 12.5
dog 12.3
savings 12.1
money 11.9
undomesticated 11.9
months 11.8
purebred 11.6
cut out 11.4
save 11.4
funny 11
creature 10.6
economy 10.2
finance 10.1
canine 10
piggy bank 9.6
investment 9.2
kitty 9
curious 8.7
eyes 8.6
sitting 8.6
close 8.6
nobody 8.5
male 8.5
puppy 8.4
banking 8.3
gold 8.2
childhood 8.1
hair 7.9
lying down 7.9
paw 7.8
portrait 7.8
tail 7.7
one 7.5
piglet 7.4
business 7.3
wealth 7.2
currency 7.2
cartoon 7.1

Google
created on 2019-07-07

Animal figure 93.4
Figurine 75.3
Carving 71.2
Sheep 69.7
Sheep 68.6
Stone carving 66.8
Sculpture 62
Statue 59.7
Cow-goat family 58.8
Livestock 58.6
Fawn 53
Artifact 51.7
Art 50.2

Microsoft
created on 2019-07-07

animal 92.9
statue 78.7
animal figure 69.9

Categories

Imagga

pets animals 99.4%

Captions

Microsoft
created on 2019-07-07

a brown and white animal 39.6%
a close up of a bear 26.9%
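
The machine-generated tags above each carry a confidence score on a 0–100 scale. A minimal sketch of how such tags might be filtered by a confidence threshold, assuming they are held as simple (label, score) pairs (the `filter_tags` helper and the threshold value are illustrative, not part of any service's API):

```python
def filter_tags(tags, threshold=70.0):
    """Keep only tags whose confidence meets the threshold."""
    return [(label, score) for label, score in tags if score >= threshold]

# The Amazon tags listed above, as (label, score) pairs:
amazon_tags = [
    ("Figurine", 99.9), ("Bronze", 70.0), ("Art", 68.0),
    ("Sculpture", 67.7), ("Statue", 67.7), ("Rust", 59.8),
]

print(filter_tags(amazon_tags))  # [('Figurine', 99.9), ('Bronze', 70.0)]
```

With the default threshold of 70, only "Figurine" and "Bronze" survive; lowering it would admit the weaker "Art", "Sculpture", and "Statue" labels as well.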