Human Generated Data

Title

Horse

Date

-

People

-

Classification

Sculpture

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Joseph C. Hoppin, 1925.30.93

Machine Generated Data

Tags

Amazon
created on 2022-06-11

Figurine 99.2
Dinosaur 97.7
Animal 97.7
Reptile 97.7
Sculpture 64.8
Art 64.8
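Each service reports its tags as plain "label confidence" pairs like the list above. A minimal sketch (the `parse_tags` helper and the 90.0 threshold are illustrative assumptions, not part of any service's API) of turning such a listing into structured data and keeping only high-confidence labels:

```python
# Raw tag listing as it appears above (Amazon, created on 2022-06-11).
raw = """Figurine 99.2
Dinosaur 97.7
Animal 97.7
Reptile 97.7
Sculpture 64.8
Art 64.8"""

def parse_tags(text, min_confidence=90.0):
    """Parse 'label confidence' lines into (label, score) pairs,
    dropping anything below min_confidence."""
    pairs = []
    for line in text.splitlines():
        # Split on the last space so multi-word labels stay intact.
        label, _, score = line.rpartition(" ")
        pairs.append((label, float(score)))
    return [(label, s) for label, s in pairs if s >= min_confidence]

print(parse_tags(raw))
# [('Figurine', 99.2), ('Dinosaur', 97.7), ('Animal', 97.7), ('Reptile', 97.7)]
```

With the default 90.0 cutoff, the two lowest-scoring tags (Sculpture, Art) are filtered out.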

Clarifai
created on 2023-10-29

monochrome 99.4
portrait 98.9
animal 98.4
bird 97.4
one 97.4
no person 97.3
art 97
sculpture 96.4
museum 95.9
wildlife 95.6
model 95.4
nature 95.3
cat 94.7
black and white 93.9
dog 92.3
people 92.1
side view 91.2
nude 90.8
mammal 90.5
grey 90

Imagga
created on 2022-06-11

hammer 37.5
bookend 33.7
support 29.7
equipment 27.9
device 26.9
sports equipment 26.5
metal 18.5
animal 17.5
3d 16.3
object 16.1
tool 14.4
steel 13.3
wildlife 11.6
single 10.7
human 10.5
high 10.4
gull 10.4
close 10.3
bird 10.2
closeup 10.1
wrench 9.9
plug 9.9
hand 9.7
symbol 9.4
studio 9.1
fashion 9
style 8.9
seagull 8.9
spanner 8.8
wing 8.8
wild 8.7
shiny 8.7
work 8.6
flight 8.6
construction 8.6
iron 8.4
power 8.4
person 8.3
one 8.2
brown 8.1
sea 8
silver 8
soaring 7.9
soar 7.9
black 7.8
feather 7.7
sky 7.7
marine 7.6
tools 7.6
flying 7.6
character 7.5
free 7.5
art 7.5
ocean 7.5
fly 7.5
air 7.4
freedom 7.3

Google
created on 2022-06-11

Head 97.4
Table 90.1
Horse 89
Human body 88.6
Toy 87.3
Wood 84.6
Sculpture 83
Terrestrial animal 78.5
Working animal 76.8
Art 75.7
Snout 75.4
Tail 73.8
Font 73.7
Chair 71.9
Animal figure 71.5
Rectangle 70.7
Metal 66.2
Monochrome photography 65.5
Statue 64.9
Fashion accessory 63.4

Microsoft
created on 2022-06-11

statue 83.7
animal 82.7
art 77.7
black and white 65.5
sculpture 58
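The five services above disagree considerably (Imagga's top guess is "hammer"; Amazon's is "figurine"), so one reasonable way to read the data is to look for labels several services agree on. A small sketch, using a subset of the confidences listed above (the `consensus` function and the three-service threshold are assumptions for illustration, not anything the services provide):

```python
from collections import defaultdict

# Subset of the tag/confidence pairs listed above, keyed by service.
tags = {
    "Amazon":    {"figurine": 99.2, "dinosaur": 97.7, "animal": 97.7,
                  "reptile": 97.7, "sculpture": 64.8, "art": 64.8},
    "Clarifai":  {"monochrome": 99.4, "animal": 98.4, "art": 97.0,
                  "sculpture": 96.4, "museum": 95.9},
    "Google":    {"head": 97.4, "horse": 89.0, "sculpture": 83.0,
                  "art": 75.7},
    "Microsoft": {"statue": 83.7, "animal": 82.7, "art": 77.7,
                  "sculpture": 58.0},
}

def consensus(tags, min_services=3):
    """Return labels reported by at least min_services services,
    mapped to their mean confidence."""
    scores = defaultdict(list)
    for labels in tags.values():
        for label, conf in labels.items():
            scores[label].append(conf)
    return {label: sum(c) / len(c)
            for label, c in scores.items() if len(c) >= min_services}

print(consensus(tags))
```

On this subset, only "animal", "art", and "sculpture" clear the three-service bar; notably, "horse" (the object's actual title) is reported by Google alone.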

Feature analysis

Amazon

Dinosaur 97.7%

Captions

Microsoft
created on 2022-06-11

a body of water 28.1%