Human Generated Data

Title

Virgin Praying

Date

17th century

People

Artist: François de Poilly the Elder, French 1622/23 - 1693

Artist after: Raphael, Italian 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R2463

Machine Generated Data

Tags

Amazon
created on 2019-07-30

Art 98.6
Painting 98.6
Person 97.3
Human 97.3
Portrait 59.4
Photography 59.4
Face 59.4
Photo 59.4

Clarifai
created on 2019-07-30

people 99.9
one 99.6
adult 99
portrait 98
print 96.4
art 95.3
woman 92
engraving 91
administration 89.3
music 85.2
writer 82.6
leader 80.8
furniture 78.6
indoors 77.3
man 75.1
gown (clothing) 74.8
wear 74.8
veil 74.5
royalty 74.2
sit 73.9

Imagga
created on 2019-07-30

book jacket 82.9
jacket 64.5
wrapping 49
sketch 47.6
money 43.4
currency 39.5
cash 37.6
dollar 36.2
drawing 33.7
covering 33
finance 31.3
representation 29.9
bill 28.6
paper 28.3
wealth 27.8
banking 27.6
bank 26.9
dollars 26.1
financial 25
close 23.4
business 23.1
us 21.2
hundred 20.3
one 19.4
franklin 18.7
art 17.6
bills 17.5
pay 17.3
portrait 16.8
savings 16.8
banknote 16.5
exchange 16.2
rich 14.9
economy 14.8
banknotes 14.7
finances 14.5
church 13.9
closeup 13.5
painter 13.3
statue 13.1
culture 12.8
face 12.8
loan 12.5
sculpture 12.1
investment 11.9
religion 11.7
states 11.6
payment 11.6
profit 11.5
god 11.5
symbol 11.4
ancient 11.2
man 10.8
newspaper 10.7
united 10.5
capital 10.4
mosaic 10.4
creation 10.3
note 10.1
vintage 9.9
economic 9.7
success 9.7
holy 9.6
notes 9.6
commerce 9.3
product 9.3
head 9.2
sign 9
history 8.9
market 8.9
funds 8.8
wages 8.8
old 8.4
concepts 8
icon 7.9
twenty 7.9
president 7.9
architecture 7.8
antique 7.8
museum 7.8
value 7.8
artist 7.7
faith 7.7
religious 7.5
number 7.5
people 7.3
male 7.1

Google
created on 2019-07-30

Microsoft
created on 2019-07-30

text 100
book 99.1
drawing 99
sketch 99
human face 98.2
painting 95.7
person 89.6
portrait 72.3
woman 66.7
clothing 54.7

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 20-38
Gender Female, 91.6%
Confused 9.9%
Sad 5.9%
Surprised 6.1%
Angry 4.3%
Calm 61.7%
Disgusted 3.2%
Happy 8.9%

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 98.6%
Person 97.3%

Captions

Microsoft

a close up of a person holding a book 34.7%
a person sitting on a book 25.9%
a person holding a book 25.8%

Text analysis

Amazon

Hpally
ompriil egie

Google

Ve prini Regi aly
Ve
prini
Regi
aly