Human Generated Data

Title

Shirin looks at the portrait of Khusraw (text recto; painting verso of folio 46), painting from a manuscript of the Khamsa by Nizami

Date

1489

Classification

Manuscripts

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Loan from A. Soudavar in memory of his mother Ezzat-Malek Soudavar, 9.2015.46

Machine Generated Data

Tags

Amazon
created on 2019-12-04

Text 99.9
Book 99.7
Person 93.6
Human 93.6
Calendar 88
Person 87.5
Person 83.5
Rug 74.7
Person 68.2
Page 56.6

Clarifai
created on 2019-12-04

paper 98.7
retro 98.4
old 98.2
art 97.9
antique 97.8
manuscript 97.4
vintage 97.2
illustration 95.8
ancient 95.3
book bindings 94.6
painting 94.3
page 94.3
document 94.3
dirty 92.8
wear 92.4
text 91.6
picture frame 90.5
texture 89.4
parchment 89.3
desktop 89

Imagga
created on 2019-12-04

vintage 53.9
old 46.8
retro 36.9
paper 34.6
envelope 31.2
binding 30.9
grunge 29
ancient 28.6
letter 28.5
stamp 27.6
texture 27.1
antique 25.4
aged 24.5
mail 24
bookmark 22.5
postage 21.7
postmark 20.7
book jacket 20
postal 17.7
jacket 17.5
rough 17.3
art 17
dirty 16.3
wallpaper 16.1
philately 15.8
cover 15.8
empty 15.5
post 14.3
canvas 14.2
book 14
page 13.9
map 13.8
border 13.6
frame 13.3
sheet 13.2
design 13.1
container 12.7
cardboard 12.5
blank 12
note 12
decoration 12
board 11.9
circa 11.9
wrapping 11.8
covering 11.8
history 11.7
card 11.5
backgrounds 11.4
pattern 11
currency 10.8
aging 10.6
worn 10.5
document 10.2
cash 10.1
global 10
pages 9.8
country 9.7
torn 9.7
parchment 9.6
brown 9.6
insulating material 9.5
grungy 9.5
symbol 9.4
money 9.4
message 9.1
world 8.9
printed 8.9
ragged 8.8
faded 8.8
stained 8.7
used 8.7
international 8.6
damaged 8.6
comic book 8.5
finance 8.5
collection 8.1
material 8
close 8
graffito 7.9
correspondence 7.8
burnt 7.8
geography 7.7
wall 7.7
culture 7.7
sign 7.5
national 7.3
paint 7.3
bank 7.2
building material 7.2
textured 7

Google
created on 2019-12-04

Painting 89.1
Text 89
Art 82
Miniature 69.3
Illustration 69.1
Picture frame 51.3
Artwork 51.1

Microsoft
created on 2019-12-04

cartoon 96.4
text 95.9
drawing 94.1
person 91.4
painting 90.2
clothing 71.3
child art 55.9
binding 53.6

Face analysis

Amazon

AWS Rekognition

Age 36-52
Gender Female, 51.8%
Sad 51.9%
Calm 47.9%
Happy 45%
Confused 45%
Angry 45%
Surprised 45%
Fear 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 53.1%
Sad 45.1%
Fear 45%
Calm 54%
Happy 45%
Disgusted 45%
Angry 45.4%
Confused 45%
Surprised 45.4%

AWS Rekognition

Age 18-30
Gender Female, 54.4%
Angry 45.2%
Calm 52.7%
Fear 45.2%
Sad 45.3%
Disgusted 45.1%
Happy 46.1%
Confused 45.1%
Surprised 45.3%

AWS Rekognition

Age 28-44
Gender Male, 51.8%
Confused 45.1%
Angry 45.5%
Calm 52.4%
Sad 45.1%
Fear 45.2%
Surprised 46.3%
Happy 45.5%
Disgusted 45.1%

Feature analysis

Amazon

Book 99.7%
Person 93.6%
Rug 74.7%

Captions

Microsoft

a sign on the side of a building 55.8%
a close up of a sign 55.7%
a sign on a wall 55.6%

Text analysis

Amazon

orii
3b
