Human Generated Data

Title

Man with a Neckpiece, Seen from the Back

Date

1617

People

Artist: Jacques Callot, French 1592 - 1635

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, by exchange, S4.9.4

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 90.9
Person 90.9
Art 88.1
Drawing 84.6
Painting 82.9
Person 75.7
Sketch 69.9
Duel 62.4
Hand 59

Clarifai
created on 2018-04-19

people 100
print 100
illustration 99.9
engraving 99.7
art 99.6
group 99.4
adult 99.3
woodcut 98.9
leader 98.5
man 98.4
wear 98.2
facial hair 97.3
veil 96.9
etching 96.6
two 96.4
lithograph 94.6
one 94.1
military 94
administration 93.1
royalty 92.5

Imagga
created on 2018-04-19

sketch 100
drawing 100
representation 91.7
vintage 43.9
old 33.5
grunge 32.4
retro 31.2
ancient 28.6
antique 26.9
aged 25.4
paper 25.1
texture 24.3
decoration 21.4
art 20.2
graffito 19.2
stamp 18.9
frame 18.3
design 16.9
envelope 15.5
mail 15.3
pattern 15.1
postmark 14.8
wallpaper 13
letter 12.9
postage 12.8
dirty 12.7
flower 12.3
floral 11.9
damaged 11.5
old fashioned 11.4
brown 11
postal 10.8
silhouette 10.8
material 10.7
worn 10.5
style 10.4
grain 10.2
global 10
philately 9.9
grime 9.8
mottled 9.8
backgrounds 9.7
decay 9.7
post 9.5
graphic 9.5
canvas 9.5
grungy 9.5
map 9.5
symbol 9.4
page 9.3
decorative 9.2
paint 9.1
currency 9
history 9
structure 8.8
fracture 8.8
faded 8.8
geography 8.7
stain 8.7
money 8.5
card 8.5
finance 8.5
black 8.4
note 8.3
rough 8.2
border 8.1
world 8
textured 7.9
ornament 7.8
blank 7.7
parchment 7.7
business 7.3
painting 7.2
collection 7.2
bank 7.2

Google
created on 2018-04-19

Microsoft
created on 2018-04-19

text 99.1
book 96.5

Color Analysis

Feature Analysis

Amazon

Person 90.9%
Painting 82.9%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2018-04-19

a close up of a book 38.1%
close up of a book 32.6%
a hand holding a book 32.5%