Human Generated Data

Title

Kokeshi to Oiran, Toys No. 1

Date

Shōwa period, dated 1957

People

Artist: Hatsuyama Shigeru, Japanese 1897 - 1973

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of C. Adrian Rübel, 1978.199

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Art 91.3
Person 77.1
Human 77.1
Painting 64.5
Modern Art 57.3

Clarifai
created on 2019-07-06

art 99.6
illustration 99.3
retro 99.1
traditional 98.8
woman 97.2
old 97
painting 96.9
man 94.1
visuals 92.8
watercolor painting 92.2
character 91.7
artistic 91.2
people 91.2
culture 88.5
vintage 86.9
wear 86.2
veil 86.2
funny 85.6
face 83.1
texture 82.9

Imagga
created on 2019-07-06

footwear 69.6
shoe 53.2
clothing 39.2
sock 33.9
sandal 33.4
covering 30.1
hosiery 27.5
shoes 24
foot 22.8
fashion 18.1
lace 16.2
black 15.6
feet 15.5
wear 15.3
leg 14.4
leather 14.2
style 14.1
sport 12.3
design 12.1
casual 11.9
boots 11.7
close 11.4
old 11.1
man 10.7
male 10.6
body 10.4
pair 10.4
men 10.3
sports 10.2
leisure 10
new 9.7
consumer goods 9.5
walking 9.5
legs 9.4
classic 9.3
decoration 9.2
pink 9.2
summer 9
object 8.8
closeup 8.7
model 8.5
dress 8.1
detail 8
sole 7.9
skin 7.8
heels 7.8
people 7.8
used 7.7
running 7.7
boot 7.6
worn 7.6
painted 7.6
beach 7.6
healthy 7.6
relaxation 7.5
human 7.5
one 7.5
fastener 7.3
business 7.3
sexy 7.2
slide fastener 7.2
face 7.1

Google
created on 2019-07-06

Microsoft
created on 2019-07-06

drawing 99.8
child art 98.5
text 97.6
sketch 97.5
book 96.7
art 96.3
cartoon 93.6
human face 89.6
person 56.4
acrylic 55.1
painting 16.2

Color Analysis

Feature analysis

Amazon

Person 77.1%
Painting 64.5%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2019-07-06

a close up of a book 29.8%