Human Generated Data

Title

Rocks

Date

Late Momoyama to Early Edo period, early 17th century

People

Artist: Shōkadō Shōjō, Japanese, 1584 - 1639

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Paul Bernat, 1961.98.C

Machine Generated Data

Tags

Amazon
created on 2020-04-23

Art 97.5
Painting 85.1
Human 66.2
Person 66.2
Drawing 62.8
Sketch 59.5

Clarifai
created on 2020-04-23

sand 98.4
beach 97.5
mammal 97.5
people 97.5
cavalry 95
no person 94.5
one 92.6
seashore 90.7
art 90.3
wear 87
water 86.7
cattle 85.6
dirty 84.6
action 83.7
paper 82.9
crustacean 82.3
fish 79.9
war 79.3
desert 79.2
soil 79.2

Imagga
created on 2020-04-23

sand 76.8
beach 52.3
sea 38.6
ocean 37.5
landscape 33.5
ship 31.9
travel 29.6
water 29.4
gymnosperm 27.1
vacation 24.6
sky 24.5
vessel 23.9
soil 23.7
coast 23.4
shore 23.4
wreck 22.6
summer 21.9
spermatophyte 20.3
scenic 20.2
desert 20
rocks 19.8
mountains 18.5
mountain 18.1
earth 18.1
sandy 17.5
shipwreck 17.3
rock 16.5
island 16.5
outdoors 16.4
wave 16.4
sun 16.2
clouds 16.1
tourism 15.7
dune 15.5
craft 15.3
coastline 15.1
scenery 14.4
vascular plant 13.9
tropical 13.6
holiday 13.6
wilderness 12.3
stone 11.8
sunny 10.3
waves 10.2
hot 10.1
vehicle 9.9
park 9.9
wild 9.6
paradise 9.4
natural 9.4
arthropod 9.3
dry 9.3
relax 9.3
outdoor 9.2
national 9.1
horseshoe crab 8.9
footprints 8.9
hills 8.7
day 8.6
golden 8.6
land 8.6
outside 8.6
plant 8.3
lake 8.2
message 8.2
drought 7.9
cloud 7.8
tree 7.7
texture 7.7
stones 7.6
bay 7.6
relaxation 7.5
peaceful 7.3
geological formation 7.3
tourist 7.3
color 7.2
crater 7.2
river 7.1

Google
created on 2020-04-23

Drawing 73.8
Visual arts 64.9
Art 62.5
Illustration 62.3
Sketch 57.9

Microsoft
created on 2020-04-23

drawing 99.1
outdoor 98.1
sketch 97.6
text 93.1
military vehicle 56.9
painting 55.1

Feature analysis

Amazon

Painting 85.1%
Person 66.2%

Captions

Microsoft

a close up of a desert 69.2%
a person in a desert 46.4%
close up of a desert 46.3%