Human Generated Data

Title

Illustrated Manuscript of Layla and Majnun by Nizami

Date

1470

People

-

Classification

Manuscripts

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Loan from A. Soudavar in memory of his mother Ezzat-Malek Soudavar, 7.2015

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Human 92.4
Person 92.4
Person 87.3
Person 82.4
Art 79.7
Person 79
Person 75.2
Person 75.2
Rug 70.6
Flooring 67.9
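
The numbers above appear to be confidence scores (0-100) of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of how comparable labels could be generated with the boto3 SDK follows; the file name manuscript.jpg and the MinConfidence threshold are illustrative assumptions, not part of the original record.

    import boto3

    # Assumes AWS credentials are configured in the environment.
    rekognition = boto3.client("rekognition")

    # "manuscript.jpg" is a hypothetical local copy of the catalogued image.
    with open("manuscript.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=60,
        )

    # Each label carries a name and a confidence score, as listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')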

Clarifai
created on 2018-02-10

old 98.5
retro 98.2
art 97.9
antique 97.8
vintage 97.2
ancient 95.7
painting 95.3
decoration 95
architecture 94.9
wall 93.6
picture frame 91.9
paper 91.2
design 90.7
illustration 89.9
building 89.6
artistic 89.5
symbol 88.9
wear 88.5
dirty 88.4
desktop 87.9

Imagga
created on 2018-02-10

old 47.4
vintage 46.4
tile 42.8
rug 34.7
texture 32
antique 31.7
grunge 31.5
art 30.3
ancient 30.3
furnishing 25.5
retro 24.6
paper 23.5
frame 23.3
religion 19.7
wall 19.7
covering 19.7
decoration 19.3
pattern 19.2
aged 19
mosaic 18.9
binding 18.7
design 18.6
letter 15.6
culture 15.4
rusty 15.3
door 15.2
postage 14.8
blank 14.6
border 14.5
gold 14
wallpaper 13.8
empty 13.8
stamp 13.6
parchment 13.4
mail 13.4
worn 13.4
post 13.4
brown 13.3
dirty 12.7
entrance 12.6
decorative 12.5
history 12.5
building 12
ornate 11.9
architecture 11.7
religious 11.2
style 11.1
bookmark 10.9
carving 10.9
postmark 10.8
traditional 10.8
postal 10.8
backdrop 10.7
ornament 10.4
close 10.3
artwork 10.1
global 10
wood 10
material 9.8
decor 9.7
museum 9.7
temple 9.5
ornamental 9.5
transducer 9.5
page 9.3
cover 9.3
church 9.3
travel 9.2
paint 9.1
doorway 8.9
textured 8.8
symbol 8.8
delivery 8.8
torn 8.7
sculpture 8.7
used 8.6
obsolete 8.6
painted 8.6
grungy 8.5
document 8.4
sill 8.3
historic 8.3
rough 8.2
backgrounds 8.1
tray 8.1
board 8.1
metal 8.1
stamps 7.9
shabby 7.8
carved 7.8
ragged 7.8
letters 7.7
stained 7.7
cardboard 7.7
rustic 7.7
weathered 7.6
sheet 7.5
china 7.5
device 7.5
arabesque 7.2
painting 7.2
world 7.1
book 7.1
surface 7.1

Google
created on 2018-02-10

text 87.5
picture frame 74.4
art 71.4
miniature 62.6
history 60.4
artwork 53.7
painting 50.7
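
The Google tags above resemble label annotations from the Cloud Vision API. A hedged sketch using the google-cloud-vision client is shown below; the file path is a placeholder, and the API's 0-1 scores are scaled to match the percentages listed here.

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
    client = vision.ImageAnnotatorClient()

    # "manuscript.jpg" is a hypothetical local copy of the catalogued image.
    with open("manuscript.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Each annotation carries a description and a 0-1 score.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")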

Microsoft
created on 2018-02-10

building 99.5
old 81.9
gallery 50.4
picture frame 25.9
painted 22.2
painting 22.1

Face analysis

Amazon

AWS Rekognition

Age 17-27
Gender Female, 53.5%
Surprised 46.3%
Sad 45.7%
Disgusted 48.4%
Calm 46.8%
Happy 45.8%
Angry 46.6%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Female, 52.9%
Disgusted 47.6%
Confused 45.2%
Happy 47%
Angry 45.6%
Calm 48%
Surprised 46%
Sad 45.7%

AWS Rekognition

Age 20-38
Gender Female, 50.1%
Surprised 45.3%
Disgusted 45.2%
Confused 45%
Happy 53.6%
Calm 45.7%
Sad 45.1%
Angry 45.1%

AWS Rekognition

Age 20-38
Gender Female, 54.9%
Calm 45.2%
Angry 45.2%
Surprised 45.2%
Sad 45.4%
Disgusted 45.5%
Happy 53.4%
Confused 45.1%

AWS Rekognition

Age 23-38
Gender Female, 53.6%
Surprised 45.4%
Calm 50.8%
Angry 45.3%
Disgusted 45.3%
Happy 46.1%
Confused 45.2%
Sad 47%

AWS Rekognition

Age 48-68
Gender Female, 51.9%
Happy 45.1%
Sad 51.6%
Calm 46.1%
Confused 45.4%
Angry 45.4%
Disgusted 46%
Surprised 45.5%

AWS Rekognition

Age 16-27
Gender Female, 54.2%
Angry 46.1%
Calm 46.6%
Surprised 46%
Disgusted 46.9%
Sad 45.9%
Happy 48.3%
Confused 45.2%

AWS Rekognition

Age 14-25
Gender Female, 50.1%
Happy 45.1%
Angry 45.4%
Disgusted 47.3%
Sad 46%
Surprised 45.4%
Confused 45.4%
Calm 50.3%
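
The eight face records above match the shape of AWS Rekognition DetectFaces output: an estimated age range, a gender guess with confidence, and a confidence score per emotion. A minimal sketch, assuming the same boto3 client and hypothetical image file as in the earlier example:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("manuscript.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion attributes
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')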

Feature analysis

Amazon

Person 92.4%
Rug 70.6%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2018-02-10

a painting on the side of a building 63.3%
a painting on the wall 63.2%
a painting on a wall 63.1%
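
The captions above look like output from the Azure Computer Vision Describe Image operation. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file path are placeholders, not values from the original record.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key; substitute real Cognitive Services values.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    # "manuscript.jpg" is a hypothetical local copy of the catalogued image.
    with open("manuscript.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    # Each caption carries text and a 0-1 confidence score.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")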

Text analysis

Amazon

i!
DSGI
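
The detected strings above are consistent with AWS Rekognition DetectText output, which often yields short, spurious fragments on non-Latin script such as the Persian text of this manuscript. A minimal sketch, reusing the same assumptions as the earlier Rekognition examples:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("manuscript.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is a LINE or WORD with the recognized string and confidence.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f'{detection["Confidence"]:.1f}%')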