Human Generated Data

Title

Standing Fudō Myōō with Sword and Rosary

Date

Muromachi period, 1392-1568 or later

People

-

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Edward W. Forbes and Paul J. Sachs, 1932.26

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2023-08-30

Art 99.9
Painting 99.9
Person 99.2
Face 94.8
Head 94.8
Drawing 75.9
Car 74
Transportation 74
Vehicle 74
Text 63.5
Archaeology 57.8
Home Decor 56.5
Accessories 55.2
Ornament 55.2
Tapestry 55.2
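
These Amazon labels are the kind of output returned by the Rekognition DetectLabels operation, where each tag carries a confidence score out of 100. A minimal sketch of how comparable tags could be generated with boto3, assuming AWS credentials are configured; the image filename below is hypothetical, not part of this record:

# Sketch only: label detection with AWS Rekognition via boto3.
# "fudo_myoo.jpg" is a hypothetical filename.
import boto3

rekognition = boto3.client("rekognition")

with open("fudo_myoo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest score listed above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")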

Clarifai
created on 2023-10-29

illustration 99.1
people 99
art 97.6
chalk out 97.4
adult 96
print 95.4
man 94.9
retro 94.5
woodcut 93.5
ancient 92.9
vintage 91.8
old 90.9
design 90
one 90
book bindings 88.3
vector 88.1
visuals 88
religion 87.7
antique 84.7
decoration 83.8

Imagga
created on 2018-12-17

sketch 100
drawing 100
representation 94.8
vintage 33.9
grunge 29
old 27.2
retro 27.1
antique 26.9
ancient 26
art 25.4
paper 23.5
texture 21.6
pattern 19.8
design 16.3
frame 15.8
decoration 14.7
graffito 14.1
aged 13.6
dirty 13.6
map 12.9
wallpaper 12.3
money 11.9
black 11.4
close 11.4
travel 11.3
floral 11.1
finance 11
painting 10.9
sepia 10.7
damaged 10.5
decorative 10
stamp 9.9
currency 9.9
history 9.9
business 9.7
geography 9.6
old fashioned 9.5
grungy 9.5
graphic 9.5
plan 9.5
note 9.2
element 9.1
paint 9.1
atlas 9
style 8.9
world 8.9
ink 8.7
page 8.4
silhouette 8.3
artwork 8.2
ornate 8.2
gold 8.2
global 8.2
shape 8.2
border 8.2
material 8
postmark 7.9
postage 7.9
postal 7.9
artistic 7.8
ornament 7.8
torn 7.7
states 7.7
navigation 7.7
flower 7.7
mail 7.7
worn 7.6
scroll 7.6
capital 7.6
canvas 7.6
detail 7.2
color 7.2
religion 7.2
bank 7.2
structure 7.2
financial 7.1
textured 7
leaf 7

Google
created on 2018-12-17

Microsoft
created on 2018-12-17

text 100
woodcut 100
book 100
sketch 12.8
drawing 9
art 6.1

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 59.4%
Angry 44.1%
Calm 41.5%
Surprised 9.2%
Fear 6.3%
Confused 4%
Sad 2.5%
Disgusted 2%
Happy 1.1%
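
The age range, gender, and emotion percentages above match the shape of the Rekognition DetectFaces response when all facial attributes are requested. A hedged sketch under the same assumptions as the label example:

# Sketch only: face attribute analysis with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("fudo_myoo.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")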

Feature analysis

Amazon

Person 99.2%
Car 74%

Captions

Microsoft
created on 2018-12-17

a close up of a book 50.2%
close up of a book 42.8%
a book on top of a building 41%
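
These captions resemble the output of the Microsoft Computer Vision "describe image" operation, which returns ranked caption candidates with confidence values. A sketch assuming the azure-cognitiveservices-vision-computervision Python SDK; the endpoint, key, and image URL below are placeholders, not values from this record:

# Sketch only: image captioning via the Azure Computer Vision SDK.
# ENDPOINT, KEY, and IMAGE_URL are hypothetical placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "<your-key>"
IMAGE_URL = "https://example.org/fudo_myoo.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
description = client.describe_image(IMAGE_URL, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")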

Text analysis

Amazon

79
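
The single recognized string ("79") is consistent with the Rekognition DetectText operation, which returns detected lines and words with confidence scores. A sketch under the same assumptions as the label example:

# Sketch only: text detection with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("fudo_myoo.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

result = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in result["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")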