Human Generated Data

Title

Horseman

Date

-

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of S. Cary Welch, Jr., 1958.174
Machine Generated Data

Tags

Amazon
created on 2022-06-11

Figurine 98.5
Porcelain 82.3
Art 82.3
Pottery 82.3
Horse 78.2
Mammal 78.2
Animal 78.2
Sculpture 70.5
Finger 56.0

Clarifai
created on 2023-10-29

monochrome 98.3
one 98.0
animal 96.9
people 96.9
no person 96.2
art 95.5
mammal 94.9
action energy 93.8
cavalry 93.4
canine 93.2
sculpture 92.4
dog 91.6
portrait 90.9
black and white 90.1
adult 89.8
nature 89.6
pet 89.4
toy 89.1
jump 87.9
strong 87.5

Imagga
created on 2022-06-11

hammerhead 95.8
shark 76.5
fish 22.3
sea 17.2
wildlife 16.9
bird 16.7
ice 12.4
sky 12.1
starfish 12.0
wings 11.5
wing 11.5
tropical 11.1
love 11
ocean 10.8
feather 10.6
flight 10.6
close 10.3
black 10.2
closeup 10.1
water 10
body 9.6
marine 9.5
people 9.5
free 9.4
natural 9.4
fly 9.3
statue 9.1
star 9
dog 8.8
wild 8.7
life 8.6
flying 8.5
outdoor 8.4
animals 8.3
sculpture 8.3
sand 8.3
freedom 8.2
art 8
device 8
high 7.8
beak 7.7
summer 7.7
elegant 7.7
stone 7.7
shell 7.6
beach 7.6
brown 7.4
sexy 7.2
gull 7.1

Google
created on 2022-06-11

Microsoft
created on 2022-06-11

statue 93.4
animal 92.9
sculpture 91.6
black and white 86.8
text 76.2
white 63.2

Color Analysis

Feature Analysis

Amazon

Horse 78.2%

Categories

Imagga

paintings art 99.2%

Captions

Microsoft
created on 2022-06-11

a close up of an animal 74.3%
close up of an animal 68.8%
an animal with its mouth open 58.3%