Human Generated Data

Title

Peasant Woman

Date

People
Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, 1898.593

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Human 92.3
Person 89.6
Art 88.7
Painting 88.7
Face 83.3
Drawing 76.9
Head 74.5
Portrait 71.4
Photography 71.4
Photo 71.4
Sketch 63.5
Skin 58.6
Money 55.4
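
Each Rekognition label above carries a confidence score, and the list trails off into weaker guesses ("Money 55.4"). A minimal sketch of thresholding these scores, in plain Python rather than the Rekognition API itself (the label/score pairs are taken from the list above; the 80-point cutoff is an arbitrary choice for illustration):

```python
# Amazon Rekognition label/confidence pairs, copied from the list above.
amazon_labels = {
    "Human": 92.3, "Person": 89.6, "Art": 88.7, "Painting": 88.7,
    "Face": 83.3, "Drawing": 76.9, "Head": 74.5, "Portrait": 71.4,
    "Photography": 71.4, "Photo": 71.4, "Sketch": 63.5, "Skin": 58.6,
    "Money": 55.4,
}

def confident_labels(labels, min_confidence=80.0):
    """Keep only labels at or above the confidence threshold, highest first."""
    kept = {name: score for name, score in labels.items() if score >= min_confidence}
    return sorted(kept, key=kept.get, reverse=True)

print(confident_labels(amazon_labels))
```

At the 80-point cutoff this keeps only Human, Person, Art, Painting, and Face, discarding the low-confidence "Money" guess that Imagga, below, leans into heavily.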

Clarifai
created on 2019-05-30

people 99.9
portrait 99.5
art 99.5
adult 99.2
one 99.1
print 98.7
engraving 98.1
painting 98
man 97.5
leader 97.2
old 94.3
illustration 93.8
elderly 92.9
writer 92.2
poet 85.6
facial hair 84.9
veil 84
scientist 83.8
administration 82.7
wear 82.2

Imagga
created on 2019-05-30

sketch 100
drawing 95.6
representation 89
money 39.2
currency 37.7
cash 35.7
dollar 32.5
bill 27.6
banking 27.6
wealth 26.1
paper 25.9
finance 25.4
bank 25.1
one 23.9
close 23.4
portrait 22.7
financial 22.3
business 20.7
dollars 20.3
us 19.3
hundred 18.4
face 17.8
savings 17.7
man 17.5
franklin 16.8
rich 15.8
banknotes 15.7
banknote 15.6
art 15.5
pay 15.4
exchange 15.3
loan 14.4
male 14.2
economy 13.9
finances 13.5
closeup 13.5
bills 12.6
old 12.6
mug shot 11.8
human 11.3
people 11.2
church 11.1
photograph 11.1
note 11
investment 11
head 10.9
statue 10.9
economic 10.7
payment 10.6
god 10.5
person 10.5
commerce 10.3
culture 10.3
sculpture 10.1
market 9.8
success 9.7
profit 9.6
antique 9.5
ancient 9.5
adult 9.1
religion 9
funds 8.8
notes 8.6
expression 8.5
vintage 8.3
symbol 8.1
history 8.1
concepts 8
twenty 7.9
president 7.9
bible 7.8
value 7.8
price 7.7
serious 7.6
capital 7.6
sign 7.5
black 7.2
museum 7.2
icon 7.1

Google
created on 2019-05-30

Face 95.8
Head 89.8
Drawing 89.4
Forehead 88.5
Portrait 87.7
Sketch 83.8
Self-portrait 83.6
Art 80
Painting 71.9
Illustration 65.5
Visual arts 64.9
Artwork 62.9
Wrinkle 61.2
Jaw 57.4

Microsoft
created on 2019-05-30

text 100
sketch 99.9
drawing 99.9
book 99.6
painting 98.8
human face 96.2
child art 92
art 90.9
person 84
portrait 75.9
illustration 75
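
The five services disagree in interesting ways (Imagga drifts toward currency and banking; Clarifai guesses occupations), so one rough way to read them together is to keep only tags that several services proposed independently. A sketch under that assumption, using lower-cased subsets of the tag lists above (the 4-of-5 threshold is an arbitrary choice):

```python
from collections import Counter

# Lower-cased subsets of each service's tag list above.
services = {
    "amazon": {"human", "person", "art", "painting", "face", "drawing",
               "head", "portrait", "photography", "photo", "sketch",
               "skin", "money"},
    "clarifai": {"people", "portrait", "art", "adult", "one", "print",
                 "engraving", "painting", "man", "old", "illustration"},
    "imagga": {"sketch", "drawing", "representation", "money", "currency",
               "portrait", "face", "art", "man", "old", "people", "human",
               "head", "person", "adult"},
    "google": {"face", "head", "drawing", "forehead", "portrait", "sketch",
               "self-portrait", "art", "painting", "illustration"},
    "microsoft": {"text", "sketch", "drawing", "book", "painting",
                  "human face", "child art", "art", "person", "portrait",
                  "illustration"},
}

def consensus_tags(services, min_services=4):
    """Tags proposed independently by at least `min_services` services."""
    counts = Counter(tag for tags in services.values() for tag in tags)
    return sorted(tag for tag, n in counts.items() if n >= min_services)

print(consensus_tags(services))
```

The 4-of-5 consensus here is art, drawing, painting, portrait, and sketch, which matches the human "Drawings" classification far better than any single service's full list.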

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 94.8%
Confused 7.8%
Happy 14.9%
Angry 31.9%
Sad 23.5%
Surprised 7.4%
Calm 7.2%
Disgusted 7.3%
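
Rekognition's emotion confidences form a rough probability distribution over the emotion classes (the values above sum to 100). A small sketch, using those values, of reading off the dominant emotion:

```python
# AWS Rekognition emotion confidences for this face, from the list above.
emotions = {
    "Confused": 7.8, "Happy": 14.9, "Angry": 31.9, "Sad": 23.5,
    "Surprised": 7.4, "Calm": 7.2, "Disgusted": 7.3,
}

# The scores behave like a distribution over emotion classes.
total = round(sum(emotions.values()), 1)  # 100.0

# Dominant emotion = highest-confidence class.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])
```

Note that even the dominant class, Angry at 31.9%, is a weak plurality, and Google Vision below disagrees outright, rating Joy as "Very likely".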

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 89.6%
Painting 88.7%

Captions

Microsoft

an old photo of a man 83.1%
a close up of a man holding a book 42%
a man holding a book 41.9%