Human Generated Data

Title

Portrait of a Man in a Fur Hat

Date

19th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, The Melvin R. Seiden Fund, Louise Haskell Daly Fund and Paul J. Sachs Memorial Fund, 1985.85

Machine Generated Data

Tags

Amazon
created on 2020-04-30

Art 95.1
Human 94.4
Person 94.4
Drawing 91.7
Sketch 81.7
Painting 80.8
Face 73
Photo 63.8
Photography 63.8
Portrait 61.8
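
These label/confidence pairs are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch of such a call with boto3 follows; the file name, region, and thresholds are placeholders rather than details taken from this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Read the image bytes; "portrait.jpg" is a stand-in for the actual scan.
with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,
    )

# Each label carries a name and a confidence score in percent,
# matching pairs such as "Art 95.1" and "Human 94.4" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')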

Clarifai
created on 2020-04-30

portrait 99.7
old 99.6
antique 99.4
vintage 99.1
people 98.9
art 98.3
print 94.6
illustration 94.5
paper 94.1
wear 93.6
retro 93.6
sepia pigment 93.1
adult 92.7
one 92.4
leader 92.4
engraving 92.3
ancient 90.7
painting 89.2
visuals 88.8
desktop 88.2
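
Clarifai concepts like these come from its general-model predict endpoint. A minimal sketch against the v2 REST API follows; the API key, image URL, and model identifier are assumptions, not values from this record.

import requests

API_KEY = "CLARIFAI_API_KEY"            # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/portrait.jpg"}}}]}
headers = {"Authorization": f"Key {API_KEY}", "Content-Type": "application/json"}

response = requests.post(url, json=payload, headers=headers).json()

# Each concept has a name and a 0-1 value; scaled to percent it reads like
# "portrait 99.7" or "old 99.6" above.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')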

Imagga
created on 2020-04-30

sketch 100
drawing 100
representation 100
currency 27
money 26.4
ancient 26
paper 25.9
old 25.1
cash 24.7
vintage 23.2
art 20.7
grunge 20.4
dollar 19.5
close 19.4
banking 19.3
antique 19.1
bill 19
retro 18.9
finance 18.6
texture 18.1
aged 17.2
bank 17
wealth 15.3
financial 14.3
stucco 14
savings 14
business 14
frame 13.3
banknotes 12.7
one 12.7
dollars 12.6
us 12.5
architecture 12.5
rich 12.1
sculpture 11.7
portrait 11.7
pattern 11.6
exchange 11.5
grungy 11.4
stone 11.3
design 11.3
decoration 11.2
economy 11.1
wall 11.1
culture 11.1
paint 10.9
history 10.7
banknote 10.7
face 10.7
pay 10.6
closeup 10.1
rough 10
border 10
dirty 9.9
hundred 9.7
finances 9.6
loan 9.6
man 9.4
letter 9.2
religion 9
twenty 8.9
detail 8.9
textured 8.8
notes 8.6
blank 8.6
statue 8.6
head 8.4
investment 8.3
brown 8.1
concepts 8
postmark 7.9
text 7.9
male 7.8
space 7.8
economic 7.8
stamp 7.7
payment 7.7
mail 7.7
god 7.7
card 7.7
worn 7.6
historical 7.5
decorative 7.5
church 7.4
figure 7.3
color 7.2
market 7.1
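
Imagga tags of this kind are returned by its /v2/tags endpoint. A minimal sketch follows; the key, secret, and image URL are placeholders.

import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials
params = {"image_url": "https://example.org/portrait.jpg"}

response = requests.get("https://api.imagga.com/v2/tags", params=params, auth=auth).json()

# Each entry pairs an English tag with a percent confidence, which is how
# lines such as "sketch 100" and "currency 27" above are read.
for item in response["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')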

Google
created on 2020-04-30

Microsoft
created on 2020-04-30

sketch 99.9
drawing 99.9
child art 95.3
painting 91.9
text 91.6
old 89.8
art 86.2
human face 83.7
black 82.7
illustration 69.1
cartoon 64.7
picture frame 6.1
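
Microsoft's tags correspond to the Azure Computer Vision tagging endpoint. A minimal sketch is below; the resource endpoint, key, API version, and image URL are placeholders.

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder resource
headers = {"Ocp-Apim-Subscription-Key": "AZURE_CV_KEY", "Content-Type": "application/json"}
body = {"url": "https://example.org/portrait.jpg"}

response = requests.post(f"{endpoint}/vision/v3.2/tag", headers=headers, json=body).json()

# Confidence comes back on a 0-1 scale; scaled to percent it matches
# entries such as "sketch 99.9" and "drawing 99.9" above.
for tag in response["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')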

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 84.7%
Disgusted 0.8%
Sad 0.8%
Angry 0.5%
Happy 5.6%
Calm 87.6%
Fear 0.2%
Confused 2.1%
Surprised 2.3%
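
Figures such as the age range, gender, and emotion scores above are what Amazon Rekognition's DetectFaces operation reports. A minimal sketch with boto3 follows; credentials, region, and the file name are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotion types such as CALM, HAPPY, SAD with percent confidences,
    # corresponding to "Calm 87.6%", "Happy 5.6%", etc. above.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')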

Microsoft Cognitive Services

Age 32
Gender Female

Feature analysis

Amazon

Person 94.4%

Categories

Imagga

paintings art 97.9%
food drinks 1.4%
