Human Generated Data

Title

Venus and Cupid

Date

c. 1880

People

Artist: William Perkins Babcock, American, 1826-1899

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sydney J. Freedberg, 1981.21

Machine Generated Data

Tags (label with model confidence score, 0-100)

Amazon
created on 2020-04-30

Art 97.1
Human 91.7
Drawing 88.1
Person 86.3
Painting 82.6
Sketch 76.1
Animal 66.3
Bird 66.3
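
The Amazon tags above are the kind of output returned by the Rekognition label-detection API. A minimal sketch of such a request, assuming boto3 credentials are configured and a local image file venus_and_cupid.jpg (hypothetical filename):

    import boto3

    # Rekognition client; the region is an assumption for illustration.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("venus_and_cupid.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # matching the "Art 97.1", "Human 91.7", ... pairs listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=60,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")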

Clarifai
created on 2020-04-30

people 99.9
print 99.8
art 99.6
adult 98.7
engraving 98.4
portrait 98.3
two 97.5
man 96.9
illustration 96.8
nude 96.7
one 96
mammal 95.8
painting 95.3
canine 95.2
group 94.1
dog 90.9
interaction 90.4
etching 89.7
veil 87.1
visuals 85.9
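
The Clarifai concepts above come from a general image-recognition model. A rough sketch of one way to request them over Clarifai's v2 REST API; the endpoint, model id, and auth scheme are assumptions to verify against current Clarifai documentation:

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"                        # placeholder credential
    MODEL_ID = "general-image-recognition"                   # assumed id of the general model
    IMAGE_URL = "https://example.org/venus_and_cupid.jpg"    # hypothetical image URL

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    response.raise_for_status()

    # Each concept carries a name and a 0-1 confidence value;
    # multiplying by 100 gives scores comparable to the list above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")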

Imagga
created on 2020-04-30

sketch 100
drawing 95.5
representation 81.5
cash 36.6
money 36.6
currency 35
dollar 30.6
finance 26.2
bank 26
banking 24.8
wealth 24.2
dollars 22.2
bill 21.9
paper 21.2
close 21.1
business 20.1
financial 19.6
savings 19.6
bills 17.5
hundred 17.4
exchange 17.2
one 16.4
pay 16.3
loan 16.3
banknotes 15.7
art 14.7
us 14.5
face 14.2
vintage 14.1
rich 14
investment 13.8
portrait 12.9
franklin 12.8
banknote 12.6
notes 12.5
finances 11.6
sculpture 11.1
economy 11.1
note 11
funds 10.8
payment 10.6
ancient 10.4
man 10.1
people 10
statue 10
envelope 10
postmark 9.9
market 9.8
old 9.8
economic 9.7
stamp 9.7
states 9.7
mail 9.6
capital 9.5
symbol 9.4
culture 9.4
architecture 9.4
letter 9.2
twenty 8.9
printed 8.9
postage 8.8
postal 8.8
wages 8.8
closeup 8.8
newspaper 8.7
united 8.6
product 8.1
circa 7.9
shows 7.9
creation 7.9
president 7.9
design 7.8
profit 7.7
religious 7.5
retro 7.4
success 7.2
religion 7.2
male 7.1
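
Imagga tags of this kind are typically retrieved from its REST tagging endpoint with HTTP basic auth. A sketch under those assumptions (endpoint path and response shape should be checked against Imagga's current docs):

    import requests

    IMAGGA_KEY = "YOUR_KEY"          # placeholder credentials
    IMAGGA_SECRET = "YOUR_SECRET"
    IMAGE_URL = "https://example.org/venus_and_cupid.jpg"    # hypothetical image URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",                    # assumed v2 tagging endpoint
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Tags arrive with confidence scores on a 0-100 scale, as listed above.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")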

Google
created on 2020-04-30
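
No Google labels are listed for this record. For reference, requests of this kind are commonly made through the Cloud Vision API; a minimal sketch, assuming the google-cloud-vision Python client and a local copy of the image (hypothetical filename):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("venus_and_cupid.jpg", "rb") as f:  # hypothetical local copy of the image
        content = f.read()

    # label_detection returns LabelAnnotation objects with a description
    # and a 0-1 score; scaling by 100 gives values comparable to the other providers.
    response = client.label_detection(image=vision.Image(content=content))
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")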

Microsoft
created on 2020-04-30

text 99.8
drawing 99.6
sketch 99.6
book 97.9
art 94.2
painting 87.1
illustration 86.5
cartoon 83.1
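
The Microsoft tags above resemble output from the Azure Computer Vision analyze endpoint. A rough sketch, with the endpoint URL, API version, and response shape treated as assumptions to verify against Azure's documentation:

    import requests

    AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder resource
    AZURE_KEY = "YOUR_KEY"                                                # placeholder key
    IMAGE_URL = "https://example.org/venus_and_cupid.jpg"                 # hypothetical image URL

    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",   # assumed API version
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    response.raise_for_status()

    # Tags carry 0-1 confidences; scaling by 100 matches the list above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")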

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 47-65
Gender Male, 52%
Surprised 1.2%
Disgusted 2.1%
Happy 15.8%
Angry 9.7%
Confused 2.5%
Calm 27.5%
Fear 4.8%
Sad 36.4%

AWS Rekognition

Age 31-47
Gender Female, 94.5%
Angry 0.1%
Happy 47.1%
Disgusted 0%
Surprised 0.1%
Fear 0.1%
Calm 51.3%
Confused 0.1%
Sad 1.3%
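
The two face records above (estimated age range, gender, and per-emotion confidences) correspond to the fields Rekognition's face-detection API returns. A minimal boto3 sketch, again assuming a local copy of the image (hypothetical filename):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open("venus_and_cupid.jpg", "rb") as f:  # hypothetical local copy of the image
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions for each detected face.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")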

Feature analysis

Amazon

Person 86.3%
Painting 82.6%
Bird 66.3%

Categories

Imagga

paintings art 79.4%
nature landscape 18.8%
pets animals 1.4%
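
Category scores like these come from Imagga's categorization service rather than its tagger. A heavily hedged sketch; the categorizer id ("personal_photos") and the response shape are assumptions to confirm against Imagga's documentation:

    import requests

    IMAGGA_KEY = "YOUR_KEY"          # placeholder credentials
    IMAGGA_SECRET = "YOUR_SECRET"
    IMAGE_URL = "https://example.org/venus_and_cupid.jpg"    # hypothetical image URL

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",  # assumed categorizer id
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    # Categories are assumed to carry an English name and a 0-100 confidence.
    for category in response.json()["result"]["categories"]:
        print(f"{category['name']['en']} {category['confidence']:.1f}%")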