Human Generated Data

Title

Laura, Madonna (?)

Date

1819

People

Artist: Raphael Morghen, Italian 1758-1833

Artist after: Lippo Memmi, Italian active 1317-c. 1350

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2798

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99
Human 99
Art 96.2
Painting 91
Drawing 89.7
Sketch 60.2

Clarifai
created on 2019-11-16

people 99.7
portrait 99.5
one 98.5
adult 98.3
art 98.1
wear 97.1
woman 97.1
retro 95.3
painting 93.8
girl 92.5
music 87.7
child 87
facial expression 85.3
fashion 84.4
furniture 83.4
bill 82.7
man 81.3
model 81.1
vintage 81
indoors 80.5

Imagga
created on 2019-11-16

mug shot 98.4
photograph 76.6
representation 66.5
creation 43.1
money 23.8
paper 23.6
envelope 23.1
business 21.3
currency 20.6
cash 19.2
bill 19
portrait 18.1
people 17.9
bank 17.3
face 16.3
sign 15.8
banking 15.6
dollar 14.9
blank 14.6
wealth 13.5
person 13.3
card 13
empty 12.9
black 12.8
finance 12.7
container 12.6
close 12.6
adult 12.3
closeup 12.1
man 12.1
note 11.9
attractive 11.9
financial 11.6
savings 11.2
rich 11.2
economy 11.1
message 11
head 10.9
holding 10.7
banknote 10.7
exchange 10.5
old 10.5
expression 10.2
model 10.1
symbol 10.1
investment 10.1
male 10
pretty 9.8
one 9.7
success 9.7
office 9.6
eyes 9.5
cute 9.3
smile 9.3
vintage 9.1
design 9
drawing 9
lady 8.9
sketch 8.9
happy 8.8
pay 8.6
loan 8.6
art 8.6
hand 8.4
texture 8.3
human 8.3
board 8.1
women 7.9
textured 7.9
brunette 7.8
queen 7.8
dollars 7.7
pound 7.7
document 7.4
retro 7.4
notebook 7.3
new 7.3
book jacket 7.3
student 7.2
sexy 7.2
market 7.1
copy 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

human face 98.9
drawing 97.5
gallery 97
sketch 94.5
scene 93.7
person 93.3
clothing 92.1
room 91.8
woman 84.9
text 81.1
smile 63.4
painting 55.6

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 99.3%
Sad 2.5%
Surprised 0.5%
Angry 1.1%
Confused 1.4%
Fear 0.3%
Happy 0.4%
Calm 93.1%
Disgusted 0.7%

Microsoft Cognitive Services

Age 25
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories

Imagga

paintings art 99.8%

Captions

Text analysis

Amazon

VIDER
VIVA
G1.1
OCCII.C'IE
BNAT'I G1.1 OCCII.C'IE ('IL I.A VIDER VIVA
I.A
BNAT'I
('IL
Inherre
Ay:
Feveswm
Ay: inmetinee Feveswm Alono
inmetinee
Alono
On

Google

BEATI G1.t OCCHI,IUE A VIDER VIVA
BEATI
G1.t
OCCHI,IUE
A
VIDER
VIVA