Human Generated Data

Title

Book V.47. Capitol and citadel (Tarpeian Rock) of Rome taken by Gauls

Date

1493

People

Artist: Anonymous Germany, German

Artist: Anonymous Italy (Venice) 1493, Italian, Venetian

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Philip Hofer, M5400

Machine Generated Data

Tags

Amazon
created on 2019-11-03

Text 95.1
Label 92.9
Drawing 90.2
Doodle 90.2
Art 90.2
Alphabet 55.1

Clarifai
created on 2019-11-03

paper 96.8
vintage 96.7
retro 96.3
old 95.8
collection 95.2
art 94
illustration 93.3
ancient 92.6
chalk out 92.3
symbol 92.1
antique 91.6
correspondence 91.5
print 91.5
letter 91.4
document 88.8
desktop 88.3
people 87.9
postal 86.6
pastime 82.1
ink 81.9

Imagga
created on 2019-11-03

envelope 58.7
sketch 56.6
drawing 49.8
vintage 42.2
representation 36
old 33.5
retro 32.8
ancient 31.2
paper 29.1
stamp 28
graffito 27.9
letter 23.9
aged 23.6
mail 23
grunge 23
antique 22.9
decoration 22.9
postmark 21.7
art 21.1
postage 19.7
container 19.4
postal 18.7
texture 16.7
brass 16.4
money 16.2
structure 16.2
currency 15.3
philately 14.8
finance 14.4
memorial 14.1
design 14.1
card 13
circa 12.8
symbol 12.8
history 12.5
post 12.4
map 12
cash 11.9
printed 11.8
collection 10.8
old fashioned 10.5
graphic 10.2
page 10.2
message 10.1
travel 9.9
bank 9.9
shows 9.9
financial 9.8
banking 9.2
note 9.2
global 9.1
post office 8.9
address 8.8
close 8.6
culture 8.6
business 8.5
frame 8.5
dollar 8.4
man 8.1
world 8
black 7.8
ragged 7.8
faded 7.8
dollars 7.7
wall 7.7
pay 7.7
wallpaper 7.7
worn 7.6
bill 7.6
writing 7.5
pattern 7.5
rich 7.5
paint 7.3
border 7.2
dirty 7.2
wealth 7.2

Google
created on 2019-11-03

Line art 68.3
Art 62.5

Microsoft
created on 2019-11-03

text 99.6
drawing 99.4
sketch 98.8
cartoon 95.6
illustration 77.4
picture frame 32.9

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 51.7%
Disgusted 45%
Confused 45%
Sad 45.2%
Surprised 45.3%
Fear 45.8%
Calm 52.5%
Angry 46%
Happy 45.2%

Captions

Microsoft

a close up of a sign 89.6%
close up of a sign 86.5%
a sign on a wall 78.4%

Text analysis

Amazon

au
re
ga
ba
pe
V
auaacids
IU
s
dcuro
mi
s ga dcuro auaacids LCU REaIgOLLE euanl IU
euanl
lu
LCU
REaIgOLLE
C

Google

reigioe
et
C:Din
ad
ga
elai
au
ре
re
b
ga arliacn adaciaieu reigioe elai V ba rbibolm et М C:Din mi ad au ga ре re lu re b ft Thm
adaciaieu
V
ba
rbibolm
М
lu
ft
Thm
arliacn
mi