Human Generated Data

Title

Virgin Seated Caressing the Child

Date

1513

People

Artist: Albrecht Dürer, German 1471 - 1528

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G1080

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Art 95.2
Painting 86.1
Human 78.2
Person 78.2
Person 70.4
Archaeology 56.4

Clarifai
created on 2019-11-10

people 99.8
art 99.6
print 99.3
painting 98.8
illustration 98.8
adult 98
engraving 97.5
religion 96.2
man 95.8
group 94.9
one 92.3
Renaissance 91.3
woman 90.4
god 90
book 89.8
antique 88.3
old 88.3
saint 87.3
veil 86.7
wear 86.1

Imagga
created on 2019-11-10

sketch 100
drawing 100
representation 89.8
art 29.8
ancient 27.7
old 24.4
sculpture 23.3
history 22.4
architecture 21.2
religion 20.6
statue 20.3
antique 17.8
stone 17.3
currency 17.1
money 17
cash 16.5
carving 15.8
vintage 15.7
paper 15.7
banking 15.6
temple 15.4
monument 15
culture 14.5
historic 13.8
god 13.4
figure 13
travel 12.7
close 12.6
religious 12.2
church 12
bank 11.7
decoration 11.6
spirituality 11.5
retro 11.5
dollar 11.1
finance 11
face 10.7
bill 10.5
detail 10.5
historical 10.4
golden 10.3
famous 10.2
symbol 10.1
financial 9.8
banknotes 9.8
texture 9.7
artistic 9.6
building 9.5
man 9.4
economy 9.3
design 9.2
artwork 9.2
landmark 9
wealth 9
one 9
carved 8.8
holy 8.7
exchange 8.6
grunge 8.5
black 8.4
city 8.3
tourism 8.3
aged 8.1
business 7.9
museum 7.8
marble 7.8
spiritual 7.7
pattern 7.5
column 7.5
savings 7.5
closeup 7.4
mosaic 7.2
icon 7.1

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

text 100
book 99.9
sketch 99.4
drawing 99.4
illustration 82.1
art 79.4
cartoon 77
painting 66.1
person 58.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Female, 81.9%
Disgusted 0.5%
Confused 0.3%
Angry 0.4%
Surprised 0.3%
Fear 0.4%
Calm 90.8%
Happy 1.8%
Sad 5.7%

Feature analysis

Amazon

Painting 86.1%
Person 78.2%

Categories

Imagga

paintings art 97.6%
pets animals 1.9%

Captions

Microsoft
created on 2019-11-10

a close up of a book 54.5%
close up of a book 48.7%
a hand holding a book 48.6%

Text analysis

Amazon

DI
ISI3

Google

IS13
IS13