Human Generated Data

Title

Krishna and Radha After Exchanging Clothes (recto); Cityscape (verso)

Date

c. 1760

People

Artist: Nihal Chand, Indian

Classification

Drawings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, The Stuart Cary Welch Collection, Gift of Edith I. Welch in memory of Stuart Cary Welch, 2009.202.20

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Text 98.1
Human 78.7
Person 78.7
Paper 75.9
Art 68.9
Drawing 65.9
Wall 63.8
Person 61.7
Painting 61.4
Letter 57.3
Page 56.8
Handwriting 56.6

Clarifai
created on 2020-04-25

retro 99.7
antique 99.4
wear 99.3
paper 99.2
old 99.2
parchment 99.1
vintage 98.8
dirty 98.5
ancient 98.4
rough 98
texture 97.9
page 97.8
manuscript 97.7
sepia pigment 97.3
sooty 97.2
art 96.8
soil 96.1
desktop 95.9
obsolete 94.9
artistic 94.9

Imagga
created on 2020-04-25

grunge 65.8
antique 62.4
old 61.5
vintage 59.7
aged 56.3
texture 55.7
paper 53.6
stucco 48.4
ancient 46.8
wallpaper 39.9
dirty 39.9
parchment 39.5
material 39.4
aging 36.5
document 36.3
retro 36.2
blank 34.4
damaged 33.5
decay 32.9
canvas 32.3
empty 31
worn 30.6
rough 30.2
page 28.9
pattern 28.8
journal 28
stained 28
stain 28
cardboard 27.9
frame 27.5
textured 27.3
text 27.1
crumpled 26.3
torn 26.2
grungy 25.7
border 25.4
brown 25.1
decorative 24.3
envelope 24.2
letter 23.9
space 23.3
wall 23.3
graffito 22.9
grain 22.2
backgrounds 22
sheet 21.7
manuscript 21.6
burnt 21.4
weathered 21
textures 20.9
grime 20.6
book 20.3
artistic 20
old fashioned 20
decoration 19.9
stains 19.5
map 19
surface 18.6
fracture 18.5
age 18.2
art 17.7
design 16.9
shabby 16.7
historic 16.5
messy 16.5
paint 16.3
obsolete 16.3
structure 15.7
pages 15.7
rusty 15.3
tattered 14.8
scratched 14.7
ragged 14.7
crack 14.6
distressed 13.8
rust 13.5
color 13.4
backdrop 13.2
detail 12.9
note 12.9
container 12.8
mottled 12.7
fiber 12.5
dirt 12.4
representation 12.4
yellow 12
floral 12
beige 11.6
flower 11.6
poster 11.4
cover 11.1
history 10.8
detailed 10.6
cement 10.2
board 10.2
correspondence 9.8
stationery 9.8
sepia 9.7
spotted 9.7
spot 9.6
scroll 9.6
graphic 9.5
card 9.3
artwork 9.2
brittle 8.9
materials 8.8
uneven 8.8
faded 8.8
layers 8.7
ornament 8.6
leaf 8.6
effect 8.2
abrasion 7.9
pasteboard 7.9
scrolls 7.9
letterhead 7.9
burned 7.9
crease 7.9
scratch 7.8
burning 7.8
tracery 7.3
creative 7.1

Google
created on 2020-04-25

Text 88.1
Paper 79.3
Drawing 65.2
Beige 63.7
Paper product 63.3
Art 50.2

Microsoft
created on 2020-04-25

drawing 98.8
sketch 96.7
handwriting 95.2
text 93
child art 82.2

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Male, 50.3%
Fear 49.5%
Surprised 49.5%
Sad 49.5%
Confused 49.5%
Calm 50.3%
Happy 49.5%
Angry 49.6%
Disgusted 49.5%

AWS Rekognition

Age 11-21
Gender Female, 50.2%
Confused 49.5%
Fear 49.5%
Angry 49.6%
Happy 49.5%
Calm 50.2%
Surprised 49.6%
Sad 49.5%
Disgusted 49.5%

AWS Rekognition

Age 22-34
Gender Female, 52.5%
Happy 45.2%
Calm 47.4%
Angry 51.5%
Disgusted 45%
Fear 45.1%
Sad 45.6%
Confused 45.1%
Surprised 45.1%

AWS Rekognition

Age 19-31
Gender Female, 50.3%
Surprised 49.5%
Confused 49.5%
Sad 49.5%
Disgusted 49.5%
Happy 49.5%
Angry 50.4%
Calm 49.5%
Fear 49.5%

AWS Rekognition

Age 11-21
Gender Female, 50%
Fear 49.7%
Calm 49.8%
Happy 49.5%
Sad 49.6%
Angry 49.6%
Disgusted 49.6%
Surprised 49.7%
Confused 49.5%

AWS Rekognition

Age 3-11
Gender Female, 50.4%
Surprised 49.5%
Confused 49.5%
Happy 49.6%
Angry 49.8%
Calm 49.8%
Fear 49.5%
Disgusted 49.5%
Sad 49.6%

AWS Rekognition

Age 14-26
Gender Female, 50.3%
Fear 49.5%
Happy 49.6%
Sad 49.5%
Angry 50.2%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Calm 49.7%

Feature analysis

Amazon

Person 78.7%
Painting 61.4%

Captions

Microsoft

a close up of an old building 46.4%
a close up of a building 46.3%
close up of an old building 38.7%

Text analysis

Amazon

yom