Human Generated Data

Title

Madonna and Child

Date

17th century

People

Artist: Nicolas de Poilly, French 1626 - 1690

Artist after: Jacques Stella, French 1596 - 1657

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R2474

Machine Generated Data

Tags

Amazon
created on 2019-07-30

Human 98.8
Person 98.8
Art 97.2
Painting 95.3

Clarifai
created on 2019-07-30

art 99.5
people 99
portrait 96.5
religion 96.3
saint 96.3
book 95.7
one 95.4
illustration 95.3
baby 95
church 94
god 92.9
holy 91.3
Renaissance 90.8
man 90.2
adult 89.5
image 89.2
interior 89.2
painting 88.7
antique 87
print 86.9

Imagga
created on 2019-07-30

device 67.5
corbel 63.7
knocker 57.9
bracket 51
stucco 46.9
money 38.3
support 37.9
dollar 37.1
currency 35.9
cash 34.8
finance 30.4
sculpture 28.2
paper 26.7
bill 25.7
wealth 25.1
financial 25
business 24.3
banking 23
close 21.7
hundred 21.3
dollars 21.3
architecture 21.1
bank 20.6
old 20.2
pay 20.1
art 20.1
us 19.3
statue 18.3
ancient 18.2
bills 17.5
banknote 17.5
one 17.2
detail 16.9
savings 16.8
rich 16.8
loan 15.3
exchange 15.3
franklin 14.8
capital 14.2
carving 13.2
church 12.9
building 12.7
stone 12.7
closeup 12.1
antique 12.1
banknotes 11.7
history 11.6
finances 11.6
culture 11.1
historic 11
funds 10.8
concepts 10.7
face 10.7
payment 10.6
details 10.4
monument 10.3
city 10
marble 9.9
states 9.7
pattern 9.6
god 9.6
symbol 9.4
historical 9.4
decoration 9.4
commerce 9.3
note 9.2
investment 9.2
religion 9
success 8.9
president 8.8
debt 8.7
price 8.6
profit 8.6
number 8.4
economy 8.3
style 8.2
landmark 8.1
buck 7.9
wages 7.8
paying 7.8
notes 7.7
design 7.3

Google
created on 2019-07-30

Microsoft
created on 2019-07-30

text 99.9
human face 96.4
drawing 95.4
person 95.1
book 94
sketch 92.9
painting 92.3
clothing 82.4
art 76.2
museum 67.6
baby 64
woman 60.2

Color Analysis

Face analysis

AWS Rekognition

Age 16-27
Gender Female, 90.9%
Angry 2.9%
Happy 4.9%
Surprised 2.1%
Disgusted 1.9%
Calm 76.2%
Sad 9.5%
Confused 2.5%

AWS Rekognition

Age 4-7
Gender Female, 72.2%
Disgusted 6.9%
Sad 13.2%
Calm 13.2%
Happy 12.2%
Surprised 6%
Angry 36.3%
Confused 12.2%

Microsoft Cognitive Services

Age 25
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Painting 95.3%

Categories

Imagga

paintings art 99.7%

Captions

Text analysis

Amazon

ROSE
LA
A
VIErGE
VIErGE A LA ROSE Hhe imreripfini 141 erere
erere
afler
Hhe
imreripfini
141

Google

VIERGE A LA ROSE ofler He innfin
VIERGE
A
LA
ROSE
ofler
He
innfin