Human Generated Data
Title
Sketchbook
Date
1987
People
Artist: Raphael Soyer, American, 1899 - 1987
Classification
Drawings
Credit Line
Harvard Art Museums/Fogg Museum, Bequest of Raphael Soyer, 1988.453
Machine Generated Data
Tags
Amazon
created on 2020-04-30
Art 98.4%
Drawing 98.4%
Human 98.4%
Sketch 96.3%
Person 93%
Accessory 66.7%
Glasses 66.7%
Accessories 66.7%
Face 61.6%
Advertisement 56.7%
Poster 56.7%
Text 55.9%
Clarifai
created on 2020-04-30
art 99.2%
people 99.1%
illustration 99.1%
chalk out 98.6%
man 97.8%
print 97.6%
portrait 97.3%
adult 96.6%
paper 95.8%
painting 94.8%
one 94.2%
visuals 93.4%
engraving 90.6%
wear 90.1%
retro 89%
old 88.9%
vintage 87.4%
etching 87%
leader 84.8%
antique 84.1%
Imagga
created on 2020-04-30
sketch 100%
drawing 100%
representation 100%
art 22.8%
vintage 20.7%
retro 20.5%
old 18.1%
paper 17.3%
stamp 16.5%
grunge 15.3%
mail 15.3%
ancient 14.7%
letter 14.7%
money 14.5%
design 14.1%
black 13.8%
postmark 13.8%
postage 13.8%
antique 13%
postal 12.8%
style 12.6%
pattern 12.3%
currency 11.7%
silhouette 11.6%
symbol 10.8%
man 10.8%
line 10.3%
close 10.3%
finance 10.1%
cash 10.1%
philately 9.9%
history 9.8%
texture 9.7%
detail 9.7%
ink 9.6%
artistic 9.6%
post 9.5%
graphic 9.5%
dollar 9.3%
decorative 9.2%
shape 9%
envelope 8.8%
curve 8.8%
figures 8.7%
clip art 8.3%
frame 8.3%
closeup 8.1%
financial 8%
decoration 8%
creative 7.9%
business 7.9%
wallpaper 7.7%
curl 7.6%
bill 7.6%
rich 7.4%
banking 7.4%
note 7.4%
global 7.3%
paint 7.2%
wealth 7.2%
bank 7.2%
Google
created on 2020-04-30
Drawing 97.3%
Sketch 96.3%
Cartoon 82%
Illustration 80.7%
Art 78.9%
Figure drawing 78.3%
Artwork 75.5%
Black-and-white 74.4%
Line art 71%
Self-portrait 66%
Visual arts 64.9%
Portrait 53.5%
Fictional character 52.8%
Microsoft
created on 2020-04-30
sketch 99.9%
drawing 99.8%
text 98%
art 94.6%
child art 91.2%
illustration 86.8%
cartoon 73.7%
human face 72%
picture frame 9.2%
Face analysis
Amazon
AWS Rekognition
Age 49-67
Gender Female, 53.9%
Calm 47.1%
Happy 1.8%
Disgusted 2.4%
Sad 1%
Angry 14.4%
Fear 2.4%
Surprised 28.6%
Confused 2.3%
Feature analysis
Amazon
Person 93%
Glasses 66.7%
Poster 56.7%
Categories
Imagga
paintings art 98.7%
text visuals 1.1%
Captions
Microsoft
created on 2020-04-30
a close up of text on a white background 78.2%
a close up of text on a black background 74.2%
a close up of text on a white surface 74.1%
Text analysis
Amazon
RAPHAEL
maine
SOYER
PoTrat
vnaehawen
Saes
Google
Self PorTrat Venalhaven maine RAPHAEL SOYER
Self
PorTrat
Venalhaven
maine
RAPHAEL
SOYER