Human Generated Data

Title

Women Near a Pool

Date

c. 1859-1860

People

Artist: Rodolphe Bresdin, French, 1822 - 1885

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.773

Machine Generated Data

Tags (confidence scores, in percent)

Amazon
created on 2020-04-25

Human 98.4
Person 98.4
Painting 98
Art 98
Person 97
Person 89
Person 81.7
Person 54.8
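
The labels above are object and scene tags returned by Amazon's image-labeling service with per-label confidences. As a minimal sketch (not part of the original record), the following shows how such labels could be requested with the boto3 Rekognition client; the local filename and the MinConfidence threshold are assumptions.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured in the environment

    # Hypothetical local copy of the image; the record does not include a file path.
    with open("bresdin_women_near_a_pool.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # the list above bottoms out near 55%
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')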

Clarifai
created on 2020-04-25

print 99.9
art 99.7
illustration 99.6
people 99.5
engraving 99.1
group 98.9
etching 98.1
painting 97.8
visuals 97
adult 95.8
man 95.7
vintage 94.4
sepia pigment 93.3
paper 93
tree 92.1
antique 91.6
monochrome 89.4
woman 88.4
old 88.2
river 87.9

Imagga
created on 2020-04-25

grunge 40
old 38.3
vintage 33.9
texture 32.7
art 30.5
retro 27
antique 26.3
pattern 25.3
ancient 25.1
sketch 25
drawing 24.7
aged 22.6
dirty 22.6
paper 20.5
frame 20.1
artistic 20
structure 19.6
design 19.2
grungy 19
wallpaper 17.6
stone 17.3
paint 17.2
border 17.2
wall 16.5
backdrop 16.5
stucco 16.4
architecture 16.3
textured 15.8
material 15.2
representation 15.2
rough 14.6
column 14.4
sculpture 14.1
graphic 13.9
style 13.4
blank 12.9
decoration 12.8
history 12.5
worn 12.4
text 12.2
floral 11.9
temple 11.8
aging 11.5
damaged 11.4
ornament 11.2
carving 11.1
flower 10.8
religion 10.8
stained 10.6
color 10.6
detail 10.5
weathered 10.4
brown 10.3
monument 10.3
black 10.2
memorial 10.2
historic 10.1
artwork 10.1
effect 10
cemetery 9.7
surface 9.7
torn 9.7
parchment 9.6
canvas 9.5
sheet 9.4
space 9.3
gravestone 9
mottled 8.8
decay 8.7
stain 8.6
empty 8.6
leaf 8.6
famous 8.4
page 8.4
decorative 8.4
element 8.3
shape 8.1
carved 7.8
fracture 7.8
faded 7.8
sepia 7.8
culture 7.7
obsolete 7.7
historical 7.5
note 7.4
backgrounds 7.3
holiday 7.2
travel 7

Google
created on 2020-04-25

Painting 88.5
Art 80.6
Illustration 76.4
Tree 68.7
Stock photography 62.1
Artwork 54.3
Drawing 51.1
Picture frame 50.8
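
The Google values are label annotations with scores. A minimal sketch using the Google Cloud Vision Python client is shown below; the credential setup and local filename are assumptions, and the library reports scores on a 0-1 scale, scaled here to percent to match the list above.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

    with open("bresdin_women_near_a_pool.jpg", "rb") as f:  # hypothetical local filename
        content = f.read()

    image = vision.Image(content=content)
    response = client.label_detection(image=image)

    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")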

Microsoft
created on 2020-04-25

text 99.6
book 97.5
drawing 89.6
window 82.4
engraving 73.1
sketch 68
illustration 66
old 65.9
person 59.3
painting 26.4
stone 3.8

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Male, 50%
Calm 49.6%
Sad 49.6%
Angry 49.6%
Disgusted 49.6%
Fear 50.1%
Happy 49.5%
Confused 49.5%
Surprised 49.6%

AWS Rekognition

Age 21-33
Gender Female, 50.3%
Surprised 49.8%
Fear 49.5%
Disgusted 49.5%
Happy 49.5%
Confused 49.5%
Sad 49.5%
Calm 50.1%
Angry 49.5%

AWS Rekognition

Age 11-21
Gender Female, 50.4%
Happy 49.5%
Surprised 49.6%
Sad 49.6%
Calm 50%
Fear 49.6%
Angry 49.5%
Disgusted 49.6%
Confused 49.6%

AWS Rekognition

Age 21-33
Gender Male, 50%
Fear 49.5%
Calm 49.6%
Confused 49.5%
Angry 49.6%
Happy 49.5%
Surprised 49.5%
Sad 50.1%
Disgusted 49.6%
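
Each block above corresponds to one face found by AWS Rekognition face detection, reporting an estimated age range, a gender guess with confidence, and per-emotion confidences. A minimal sketch of the underlying detect_faces call via boto3 follows; the local filename is an assumption, not part of the original record.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("bresdin_women_near_a_pool.jpg", "rb") as f:  # hypothetical local filename
        image_bytes = f.read()

    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to include AgeRange, Gender, and Emotions
    )

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}, Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'  {emotion["Type"].title()} {emotion["Confidence"]:.1f}%')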

Feature analysis

Amazon

Person 98.4%
Painting 98%
