Human Generated Data

Title

Christ Proclaiming the Law

Date

18th century

People

Artist: Raphael Morghen, Italian, 1758-1833

Artist: Manuel Esquivel Sotomayor, Spanish, 1777-1842

Artist after: Leonardo da Vinci, Italian, 1452-1519

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R12757

Machine Generated Data

Tags

Amazon
created on 2019-11-06

Human 99.1
Person 99.1
Art 94.3
Face 88.6
Painting 84.5
Drawing 83.3
Photography 67.3
Portrait 67.3
Photo 67.3
Sketch 64
Female 58.6

Clarifai
created on 2019-11-06

people 99.2
one 98.7
portrait 98.7
adult 97
wear 96.8
woman 96.3
painting 94.3
art 93.1
retro 90.3
indoors 89.6
music 89.5
child 87.3
fashion 84.8
veil 83.7
furniture 83.6
dress 83.3
sit 82.4
girl 81.7
print 81.5
illustration 78.2

Imagga
created on 2019-11-06

mug shot 71.9
photograph 62.8
representation 45.4
creation 34.9
envelope 32.1
paper 30.6
old 30
vintage 29.8
money 23
retro 22.1
antique 20.8
currency 19.8
cash 18.3
aged 18.1
grunge 17.9
bank 17.1
bill 17.1
letter 16.5
ancient 16.4
texture 16
business 15.8
banking 15.6
symbol 15.5
stamp 15.1
mail 14.4
design 14.1
dollar 13.9
book 13.8
book jacket 13.8
blank 13.7
frame 13.6
finance 13.5
financial 13.4
wealth 12.6
post 12.4
card 12.3
savings 12.1
page 12.1
art 11.9
postmark 11.8
postage 11.8
border 11.8
jacket 11.7
sign 11.3
one 11.2
empty 11.2
culture 11.1
message 11
dirty 10.8
container 10.8
office 10.6
close 10.3
pattern 10.3
note 10.1
postal 9.8
dollars 9.7
textured 9.6
black 9.6
pay 9.6
exchange 9.6
damaged 9.5
wall 9.4
economy 9.3
wrapping 9.1
history 8.9
circa 8.9
banknote 8.7
great 8.6
covering 8.5
wallpaper 8.4
document 8.4
product 8.3
investment 8.3
structure 8.2
collection 8.1
printed 7.9
queen 7.8
album 7.8
states 7.7
parchment 7.7
sheet 7.5
rough 7.3
detail 7.2
material 7.1
notebook 7.1
portrait 7.1
market 7.1

Google
created on 2019-11-06

Microsoft
created on 2019-11-06

human face 99
drawing 97.3
gallery 95.2
person 94.5
sketch 90.6
room 88.3
scene 85.2
woman 85
painting 82.8
clothing 77.3
text 69.2
art 61
portrait 58.1
posing 51.6
picture frame 21.1

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 98.8%
Happy 0.1%
Disgusted 0.2%
Fear 0.1%
Calm 97.9%
Angry 0.3%
Sad 1%
Confused 0.3%
Surprised 0.2%

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a person posing for a photo 56.6%
a person posing for a photo 49.1%
an old photo of a person 49%

Text analysis

Amazon

a
FRETIMONY
C0XO
DA2E
BIRNT QUI FRETIMONY DA2E
QUI
al
BIRNT
Le

Google

ONIUM
BUNT
ONIUM DANTN COstO K BUNT QU H.42
H.42
K
QU
DANTN
COstO