Human Generated Data

Title

Madonna and Child

Date

16th century

People

Artist: Andrea Andreani, Italian 1558/59 - 1629

Artist after: Francesco Vanni, Italian c. 1563 - 1610

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Horace M. Swope, Class of 1905, M9745

Machine Generated Data

Tags

Amazon
created on 2019-11-05

Art 92.6
Painting 91.1
Human 82.3
Person 82.3
Archangel 59.2
Angel 59.2

Clarifai
created on 2019-11-05

people 99.9
art 99.8
print 99.8
illustration 99.7
engraving 99.4
painting 98.4
portrait 96.1
Renaissance 95.6
one 95.5
adult 95.3
man 95.2
book 94.7
religion 93.4
saint 91.9
cape 91
leader 90.1
visuals 89
etching 88.8
veil 88.2
baroque 87.6

Imagga
created on 2019-11-05

sketch 74.6
drawing 54
representation 45.7
religion 38.6
comic book 37.9
statue 33.6
art 32.5
sculpture 31.1
ancient 28.6
temple 28.5
culture 27.4
architecture 24.5
religious 24.4
god 23
church 21.3
history 18.8
decoration 18.5
golden 18.1
holy 17.4
face 17.1
antique 16.5
old 16
travel 15.5
stone 15.2
print media 15.1
monument 15
money 14.5
mosaic 14.4
spirituality 14.4
spiritual 14.4
detail 13.7
carved 12.7
currency 12.6
vintage 12.4
tourism 12.4
gold 12.3
east 12.2
carving 12.1
famous 12.1
cash 11.9
traditional 11.7
meditation 11.5
faith 11.5
figure 11.5
historic 11
design 10.9
oriental 10.4
portrait 10.4
icon 10.3
close 10.3
dollar 10.2
tourist 10
museum 9.9
worship 9.7
building 9.5
historical 9.4
mask 9.4
book jacket 9.3
banking 9.2
decorative 9.2
masterpiece 8.9
closeup 8.8
man 8.7
bill 8.6
city 8.3
symbol 8.1
bible 7.8
century 7.8
banknote 7.8
pray 7.8
prayer 7.7
heritage 7.7
finance 7.6
head 7.6
one 7.5
style 7.4
tradition 7.4
jacket 7.2

Google
created on 2019-11-05

Microsoft
created on 2019-11-05

text 100
book 100
drawing 99.3
sketch 99.2
illustration 87.7
painting 86.4
person 83.7
human face 80.2
cartoon 72
art 68.2
engraving 63.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Female, 95.3%
Fear 1.4%
Happy 28.8%
Angry 0.3%
Sad 0.7%
Confused 0.4%
Calm 64.9%
Disgusted 0.5%
Surprised 3%

AWS Rekognition

Age 22-34
Gender Female, 89.7%
Sad 4%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%
Calm 95.4%
Surprised 0.2%
Fear 0.1%

Feature analysis

Amazon

Painting 91.1%
Person 82.3%

Categories

Imagga

paintings art 98.8%
pets animals 1.1%

Captions

Microsoft
created on 2019-11-05

a close up of a book 70.7%
close up of a book 65.7%
a hand holding a book 65.6%