Human Generated Data

Title

Copy after Antique Relief (Two Scenes)

Date

17th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Haldeman, 1979.23

Machine Generated Data

Tags

Imagga
created on 2018-12-21

sketch 100
representation 100
drawing 100
vintage 38.1
grunge 33.2
ancient 31.2
old 30
antique 29.4
art 28
retro 27
texture 26.4
paper 25.1
aged 21.7
design 19.1
pattern 18.5
frame 16.7
decoration 15.7
floral 15.3
decorative 14.2
artistic 13.9
wallpaper 13.8
letter 12.8
stamp 12.8
flower 12.3
style 11.9
postmark 11.8
postage 11.8
postal 11.8
paint 11.8
history 11.6
mail 11.5
creative 11.5
textured 11.4
artwork 11
black 10.8
currency 10.8
material 10.7
grungy 10.4
ornament 10.4
close 10.3
color 10
border 10
dirty 9.9
detail 9.7
worn 9.5
graphic 9.5
symbol 9.4
money 9.4
historic 9.2
bank 9
text 8.7
structure 8.7
post 8.6
curl 8.6
canvas 8.5
finance 8.5
cash 8.2
global 8.2
brown 8.1
fracture 7.8
leaf 7.8
wall 7.7
scroll 7.6
plant 7.5
page 7.4
backdrop 7.4
closeup 7.4
note 7.4
painting 7.2

Google
created on 2018-12-21

art 86.8
figure drawing 81.2
history 80.2
ancient history 74.5
drawing 70.5
human 70
relief 68.4
artwork 65.8
mythology 59.3
middle ages 57.2
tree 54.6
sketch 53.7
visual arts 52

Microsoft
created on 2018-12-21

text 100
sculpture 98.7
book 98.7
relief 60.7
sarcophagus 32.5
marble 31.9
ancient 29.1
archaeology 27.6
museum 18.4
statue 14.4
sketch 13.2

Captions

Microsoft

a close up of a book 37.5%
close up of a book 32.5%
a hand holding a book 32.4%