Human Generated Data

Title

Suma, Illustration to Chapter 12 of the "Tale of Genji" (Genji monogatari)

Date

17th century

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Mrs. John T. Linzee, 1931.250

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 98.3
Person 98.3
Building 92.8
Architecture 91.5
Advertisement 74.3
Poster 74.3
Art 72.1
Painting 72.1
Column 68
Pillar 68
Indoors 66.8
Interior Design 66.8
Temple 63.5
Worship 59.1
Shrine 59.1
Person 58.1
Power Plant 56.5
Ruins 56.1

Clarifai
created on 2020-04-24

people 99.7
print 99.6
illustration 99.4
art 98.6
painting 97.4
group 96.2
engraving 96
adult 95.7
cavalry 94.9
two 93.5
man 93.1
one 92.7
home 92.2
vehicle 91.3
war 85.4
military 85.1
veil 84.5
no person 84.1
vintage 82.5
old 82

Imagga
created on 2020-04-24

book jacket 95.2
jacket 75.1
wrapping 56.3
grunge 46
old 44.6
vintage 43.9
covering 42.8
texture 42.4
memorial 34
retro 32
aged 31.7
gravestone 30.7
antique 29.4
frame 28.4
structure 27.4
border 23.5
grungy 22.8
paper 22.8
wall 21.5
ancient 20.8
art 20.2
rough 20.1
stone 19.9
dirty 19.9
textured 19.3
design 17.5
damaged 17.2
material 17
black 16.8
text 16.6
letter 15.6
empty 15.5
blank 14.6
rusty 14.3
screen 14.1
wallpaper 13.8
worn 13.4
weathered 13.3
backdrop 13.2
blackboard 13.1
pattern 13
decoration 12.9
graphic 11.7
space 11.6
film 11.4
board 11.3
scratch 10.7
album 10.7
surface 10.6
brass 10.6
parchment 10.6
backgrounds 10.5
brown 10.3
cover 10.2
chalkboard 9.8
business 9.7
rust 9.6
edge 9.6
canvas 9.5
sheet 9.4
global 9.1
paint 9.1
book 9
chalk 8.9
messy 8.7
stain 8.7
spot 8.6
page 8.4
drawing 8.1
digital 8.1
stamp 7.9
slide 7.8
ragged 7.8
decay 7.7
stained 7.7
card 7.7
dirt 7.6
age 7.6
textures 7.6
decorative 7.5
protective covering 7.5
closeup 7.4
symbol 7.4
grain 7.4
note 7.4
graffito 7.3
detail 7.2

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 99.5
drawing 98.4
sketch 96.3
painting 96.2
child art 62.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 52.9%
Happy 46.3%
Surprised 45.1%
Calm 51.3%
Fear 45.1%
Sad 46.8%
Disgusted 45.1%
Confused 45.1%
Angry 45.3%

Feature analysis

Amazon

Person 98.3%
Poster 74.3%
Painting 72.1%

Categories

Captions

Text analysis

Amazon

1931250

Google

1931, 250
1931,
250