Human Generated Data

Title

Seated Lady Holding a Fan

Date

c. 1861

People

Artist: Jules de Goncourt, French, 1830–1870

Artist after: François Boucher, French, 1703–1770

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Elizabeth Mongan in memory of Paul J. Sachs, M22342

Machine Generated Data

Tags

Amazon
created on 2019-11-08

Art 97.3
Painting 84.6
Person 80.3
Human 80.3
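
The Amazon tags above are image labels with confidence scores. A minimal sketch of how such labels could be requested from AWS Rekognition via boto3; the image file name and thresholds are illustrative assumptions, not part of this record:

# Minimal sketch: image labels from AWS Rekognition (boto3).
# File name, MaxLabels, and MinConfidence are illustrative assumptions.
import boto3

client = boto3.client("rekognition")
with open("seated_lady_holding_a_fan.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=80,
    )

# Each label pairs a name with a confidence score, e.g. "Art 97.3".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')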

Clarifai
created on 2019-11-08

print 99.8
people 99.7
art 99.6
illustration 98.8
engraving 98.7
adult 97.8
one 97.4
wear 96.7
painting 96.3
portrait 95.3
man 92.1
antique 91.9
etching 91.4
veil 91
vintage 90.4
lithograph 90.3
retro 86.5
old 85.5
child 83.1
cape 81.6

Imagga
created on 2019-11-08

representation 100
sketch 100
drawing 100
art 32.7
ancient 23.4
statue 22.2
religion 20.6
sculpture 20.3
old 20.2
detail 19.3
antique 18.2
religious 16.9
design 14.5
decoration 14.5
pattern 14.4
god 13.4
architecture 13.3
vintage 13.2
culture 12.8
symbol 12.8
history 12.5
money 11.9
black 11.4
close 11.4
artistic 11.3
cash 11
carving 10.9
painting 10.8
currency 10.8
shape 10.4
style 10.4
ornament 10.4
paper 10.2
artwork 10.1
figure 10
texture 9.7
temple 9.5
man 9.4
stone 9.3
banking 9.2
retro 9
bank 9
color 8.9
textured 8.8
closeup 8.8
catholic 8.8
holy 8.7
spiritual 8.6
grunge 8.5
church 8.3
human 8.3
backgrounds 8.1
graphic 8
saint 7.7
spirituality 7.7
decorative 7.5
traditional 7.5
famous 7.5
creativity 7.4
dollar 7.4
gold 7.4
historic 7.3
effect 7.3
people 7.3
face 7.1
travel 7

Google
created on 2019-11-08

Microsoft
created on 2019-11-08

text 99.9
sketch 99.7
drawing 99.6
book 99.5
art 90.9
painting 79.8
illustration 64.9
cartoon 64.3

Color Analysis

Face analysis

AWS Rekognition

Age 20-32
Gender Female, 97.7%
Surprised 1.1%
Calm 89.8%
Angry 0.8%
Sad 0.7%
Disgusted 0.5%
Fear 0.2%
Confused 0.4%
Happy 6.5%
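
The age range, gender, and emotion scores above correspond to the face attributes returned by AWS Rekognition's face detection. A minimal sketch of such a call via boto3; the image file name is an illustrative assumption:

# Minimal sketch: face attributes (age range, gender, emotions) from AWS Rekognition.
import boto3

client = boto3.client("rekognition")
with open("seated_lady_holding_a_fan.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 20, "High": 32}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 97.7}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # e.g. Calm 89.8%, Happy 6.5%
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')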

Microsoft Cognitive Services

Age 25
Gender Female

Feature analysis

Amazon

Painting 84.6%
Person 80.3%

Categories

Imagga

paintings art 99.8%

Captions

Microsoft
created on 2019-11-08

a close up of a book 43.5%
close up of a book 37.9%
a hand holding a book 37.8%
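
The captions above pair generated text with a confidence score, in the style of the Azure Computer Vision "Describe Image" operation. A minimal sketch of such a request over REST; the endpoint, API version, key placeholder, and file name are assumptions for illustration:

# Minimal sketch: image captions from the Azure Computer Vision Describe Image REST API.
# Endpoint, API version, and key are placeholders / assumptions.
import requests

endpoint = "https://<your-region>.api.cognitive.microsoft.com"
url = f"{endpoint}/vision/v3.1/describe"
headers = {
    "Ocp-Apim-Subscription-Key": "<your-key>",
    "Content-Type": "application/octet-stream",
}
with open("seated_lady_holding_a_fan.jpg", "rb") as f:
    response = requests.post(url, headers=headers,
                             params={"maxCandidates": 3}, data=f.read())
response.raise_for_status()

# Each caption pairs text with a confidence, e.g. "a close up of a book 43.5%".
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')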

Text analysis

Amazon

ulis
23
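
The detections above ("ulis", "23") are OCR output. A minimal sketch of text detection with AWS Rekognition via boto3; the image file name is an illustrative assumption:

# Minimal sketch: text detection (OCR) with AWS Rekognition detect_text.
import boto3

client = boto3.client("rekognition")
with open("seated_lady_holding_a_fan.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection reports its type (LINE or WORD), the detected text, and a confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))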

Google

230
230