Human Generated Data

Title

Sketchbook

Date

1981

People

Artist: Raphael Soyer, American, 1899-1987

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Raphael Soyer, 1988.450

Machine Generated Data

Tags

Amazon
created on 2020-04-30

Human 98
Drawing 97.9
Art 97.9
Person 96.3
Sketch 95.4
Accessory 87.9
Accessories 87.9
Glasses 87.9
Face 57.8
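
The Amazon rows above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be reproduced with boto3; the image file name, region, and thresholds are illustrative assumptions and not part of this record, and it is not confirmed that this exact call produced the values shown.

```python
# Hypothetical sketch: label tagging with AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Assumed local image of this sketchbook page (file name is illustrative).
with open("1988.450_sketchbook_page.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # the lowest Amazon tag above ("Face") scores 57.8
)

# Print each label with its confidence, mirroring the tag/score rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```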

Clarifai
created on 2020-04-30

portrait 99.6
old 98.8
art 98.8
people 98.1
one 97.8
engraving 96.8
paper 96.4
illustration 95.4
man 94.6
painting 94.5
writer 94.4
adult 93.3
print 93.1
antique 92.4
elderly 91.9
vintage 89.3
etching 88.4
mustache 87.9
leader 86.9
visuals 86.9

Imagga
created on 2020-04-30

sketch 100
drawing 100
representation 100
money 44.3
currency 41.3
cash 40.3
dollar 39
bill 31.4
finance 28.8
bank 28.7
banking 27.6
paper 27.5
one 26.9
close 25.7
wealth 25.2
dollars 25.1
financial 25
business 24.3
hundred 22.3
savings 20.5
us 20.3
franklin 18.7
portrait 18.1
banknote 17.5
finances 17.4
banknotes 15.7
face 15.6
pay 15.4
loan 15.4
exchange 15.3
rich 14.9
bills 14.6
man 13.5
old 12.5
closeup 12.1
economy 12.1
investment 11.9
head 11.8
payment 11.6
notes 11.5
art 11.1
note 11
male 10.6
states 10.6
profit 10.5
success 10.5
commerce 10.3
culture 10.3
vintage 9.9
funds 9.8
economic 9.7
capital 9.5
people 9.5
concepts 8.9
market 8.9
printed 8.9
wages 8.8
stamp 8.7
price 8.7
mail 8.6
united 8.6
adult 8.4
person 8.3
human 8.3
symbol 8.1
hair 7.9
masterpiece 7.9
postmark 7.9
twenty 7.9
president 7.9
postage 7.9
value 7.8
debt 7.7
expression 7.7
number 7.5
letter 7.3
black 7.2

Google
created on 2020-04-30

Microsoft
created on 2020-04-30

sketch 99.9
drawing 99.9
child art 96.2
art 94.9
text 88.7
human face 87.9
painting 85.4
cartoon 76.1
illustration 67.1
portrait 55.3

Color Analysis

Face analysis

AWS Rekognition

Age 62-78
Gender Male, 65.8%
Surprised 6.4%
Happy 0.5%
Calm 24.7%
Fear 6.9%
Sad 44.6%
Disgusted 3.9%
Confused 5.5%
Angry 7.6%
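
The age range, gender, and emotion scores above resemble the output of Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch follows, again assuming a local image file; the file name and region are hypothetical.

```python
# Hypothetical sketch: face attribute analysis with AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Assumed local image of this sketchbook page (file name is illustrative).
with open("1988.450_sketchbook_page.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```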

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
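
The "Very unlikely" ratings above match the likelihood enums returned by Google Cloud Vision face detection. A minimal sketch using the google-cloud-vision client library is shown below; as before, the image file name is an assumption.

```python
# Hypothetical sketch: face likelihoods with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Assumed local image of this sketchbook page (file name is illustrative).
with open("1988.450_sketchbook_page.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood values are enums such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```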

Feature analysis

Amazon

Person 96.3%
Glasses 87.9%

Categories

Imagga

paintings art 99.7%

Text analysis

Amazon

RAPHAEL
Sess
PTrait
SOyPA
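
Lines such as "RAPHAEL" and "SOyPA" look like raw output from Rekognition's DetectText operation applied to the handwritten inscription. A minimal sketch, under the same assumed image file:

```python
# Hypothetical sketch: text detection with AWS Rekognition (boto3).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Assumed local image of this sketchbook page (file name is illustrative).
with open("1988.450_sketchbook_page.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections roughly correspond to the rows above; WORD detections are finer-grained.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```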

Google

Sels A PHAEL SoyeA
Sels
A
PHAEL
SoyeA