Human Generated Data

Title

Venus and Mercury

Date

ca. 1600

People

Artist: Jan Harmensz. Muller, Dutch, 1571–1628

Artist after: Bartholomeus Spranger, Netherlandish, 1546–1611

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R1975

Machine Generated Data

Tags

Amazon
created on 2019-10-30

Painting 99.5
Art 99.5
Human 82.5
Person 82.5
Person 61
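
The labels and confidence scores above have the shape of Amazon Rekognition label-detection output. A minimal sketch of how similar tags could be produced, assuming a hypothetical local copy of the digitized print ("venus_and_mercury.jpg") and AWS credentials already configured for boto3:

    import boto3

    # Sketch only: the filename is hypothetical and credentials are assumed
    # to be set up for boto3 in the environment.
    rekognition = boto3.client("rekognition")

    with open("venus_and_mercury.jpg", "rb") as f:
        image_bytes = f.read()

    # detect_labels returns label names with confidence scores, comparable to
    # the "Painting 99.5 / Art 99.5 / Person 82.5" list above.
    response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')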

Clarifai
created on 2019-10-30

people 99.3
art 99.3
illustration 97.6
print 96.5
group 96.1
Renaissance 94.6
adult 94.4
painting 93.7
man 92.9
religion 87.6
one 85.7
antique 85.5
nude 85
no person 84.4
old 84.1
baroque 84
ancient 80.6
saint 80.6
many 80.3
engraving 78.2

Imagga
created on 2019-10-30

sculpture 96.3
carving 74.2
statue 55.8
art 53.4
ancient 36.4
plastic art 35.4
religion 33.2
architecture 32.3
monument 30.9
figure 30.5
temple 30.1
history 28.7
stone 27.8
culture 26.5
old 24.4
religious 24.4
travel 22.6
god 22
historic 21.1
landmark 20.8
city 19.1
tourism 19
historical 17.9
marble 16.9
famous 16.8
holy 16.4
decoration 15.9
church 15.7
heritage 14.5
relief 14.4
spirituality 14.4
spiritual 14.4
column 14.3
building 14.3
carved 12.7
catholic 12.7
antique 12.6
sketch 12.5
detail 12.1
tourist 11.8
angel 11.7
golden 11.2
design 10.8
king 10.7
roman 10.2
symbol 10.1
drawing 10
traditional 10
museum 10
worship 9.7
palace 9.6
saint 9.6
faith 9.6
artistic 9.6
east 9.4
holiday 9.3
face 9.2
facade 8.7
cemetery 8.3
vintage 8.3
world 8.2
representation 8.1
sculptures 7.9
pray 7.8
bronze 7.6
capital 7.6
cross 7.5
decorative 7.5
style 7.4
exterior 7.4
cathedral 7

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

text 100
book 99.9
drawing 95.2
sketch 93.7
art 83.6
person 81
painting 72.2
human face 54.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 10-20
Gender Male, 94.2%
Surprised 0.1%
Disgusted 0%
Angry 4.2%
Fear 0.2%
Calm 94.5%
Happy 0.3%
Sad 0.6%
Confused 0.1%

AWS Rekognition

Age 22-34
Gender Female, 58.8%
Confused 0.1%
Disgusted 0.1%
Calm 94.5%
Angry 1.6%
Happy 2.3%
Sad 0.8%
Surprised 0.2%
Fear 0.3%

AWS Rekognition

Age 37-55
Gender Male, 54.9%
Disgusted 45.1%
Happy 45%
Fear 45.1%
Angry 50.7%
Surprised 45.3%
Calm 48.7%
Sad 45%
Confused 45.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
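
The age, gender, and emotion estimates above follow the format returned by Amazon Rekognition face detection. A minimal sketch, again assuming a hypothetical local image file and configured AWS credentials:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("venus_and_mercury.jpg", "rb") as f:  # hypothetical local image file
        image_bytes = f.read()

    # Attributes=["ALL"] asks Rekognition for age range, gender, and emotion
    # estimates for every detected face, as listed above.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f'Age {age["Low"]}-{age["High"]}, '
              f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%, '
              f'{top_emotion["Type"].title()} {top_emotion["Confidence"]:.1f}%')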

Feature analysis

Amazon

Painting 99.5%
Person 82.5%

Categories

Text analysis

Amazon

Veneris
pocula
Sic
Bacchi,
ST
nouit
Exemplum
plectro
viuis
faat
8Ad
curorGllenies
et
8Ad Veneris furtum facuuntrr pocula Bacchi, Exemplum Ut viuis ST Ida curorGllenies artis,
artis,
buiuis
B
diferia futo:
Sic faat et plectro lingua diferia futo: nouit perenmis aquis Mule
furtum
facuuntrr
aquis
lingua
perenmis
Spaym
Ida
Mule
Ut
Spaym B Jrr Tcl fy
Jrr
fy
Tcl
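
The mix of short word fragments and longer strings above is consistent with Amazon Rekognition text detection, which returns both LINE and WORD items; that is why full lines of the engraved Latin verse appear alongside the individual words that compose them. A minimal sketch under the same assumptions as the earlier examples:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("venus_and_mercury.jpg", "rb") as f:  # hypothetical local image file
        image_bytes = f.read()

    # detect_text returns both LINE and WORD detections with confidence scores.
    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])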

Google

Fnd Maller fouly Exemplum &t buus cufor Gllenus artis Vt nouit vius Ida perennis aqus Ad Veneris furtum faciant vt poaula Bacchi et
Fnd
Maller
buus
Gllenus
Vt
nouit
vius
Ida
perennis
aqus
Ad
Veneris
furtum
vt
poaula
Bacchi
et
fouly
Exemplum
&t
cufor
artis
faciant