Human Generated Data

Title

Illustrations to "Emblemata ex Horatio Flacco"

Date

16th-17th century

People

Artist: Gysbrecht van Veen, Dutch 1558/62 - 1625

Artist after: Otto van Veen, Flemish 1556 - 1629

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R1003NA

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Person 99.7
Human 99.7
Person 99
Person 97.5
Person 96.9
Art 95.8
Person 95.7
Painting 91.6
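Each machine-generated tag above pairs a label with a confidence score. A minimal sketch of filtering such a list by a confidence threshold, using the label/score pairs copied from the Amazon list above (the helper name `confident_labels` is illustrative, not part of any service's API):

```python
# Machine-generated tags as (label, confidence) pairs, copied from
# the Amazon list above.
amazon_tags = [
    ("Person", 99.7), ("Human", 99.7), ("Person", 99.0),
    ("Person", 97.5), ("Person", 96.9), ("Art", 95.8),
    ("Person", 95.7), ("Painting", 91.6),
]

def confident_labels(tags, threshold=95.0):
    """Return the distinct labels scored at or above the threshold,
    in first-seen order."""
    seen = []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(amazon_tags))  # ['Person', 'Human', 'Art']
```

At a 95% threshold the repeated "Person" detections collapse to one label and "Painting" (91.6) drops out.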

Clarifai
created on 2019-08-10

people 100
print 99.8
art 99.7
illustration 99.3
adult 98.9
engraving 98.6
group 98.5
man 98.1
painting 96.7
facial hair 96.5
leader 96.2
two 94.9
woman 94.7
affection 93.8
portrait 93.7
lithograph 93.7
Renaissance 90.4
administration 90.4
royalty 89.7
baroque 88.8

Imagga
created on 2019-08-10

column 42.1
statue 36.8
sculpture 33.1
art 27.2
architecture 26.1
religion 24.2
ancient 23.4
culture 20.5
stone 19.4
fountain 19
history 18.8
old 18.1
historical 17.9
monument 17.7
religious 16.9
god 16.3
travel 16.2
building 16
famous 15.8
temple 15.6
historic 15.6
decoration 14.9
carving 13.7
tourism 13.2
church 12.9
antique 12.9
detail 12.9
city 12.5
structure 11.7
catholic 11.7
holy 11.6
figure 11.2
marble 11.1
book jacket 10.5
decorative 10
landmark 9.9
portrait 9.7
pray 9.7
spirituality 9.6
spiritual 9.6
theater curtain 9.5
symbol 9.4
face 9.2
vintage 9.1
statues 8.9
worship 8.7
palace 8.7
design 8.6
covering 8.4
traditional 8.3
person 8.2
tourist 8.2
jacket 8.1
carved 7.8
people 7.8
golden 7.7
prayer 7.7
curtain 7.6
man 7.5
east 7.5
gold 7.4
dress 7.2
romantic 7.1
roman 7.1

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

text 99.8
book 98.7
person 94.7
drawing 94.5
clothing 93.6
sketch 89.6
painting 82.2
art 58.1
human face 55.4
man 53.2

Color Analysis

Face analysis

AWS Rekognition

Age 26-42
Gender Male, 95.9%
Angry 7.5%
Fear 2%
Confused 1.2%
Calm 11.5%
Happy 0.4%
Disgusted 0.3%
Surprised 0.3%
Sad 76.7%

AWS Rekognition

Age 21-33
Gender Male, 54.2%
Sad 45.7%
Angry 45.7%
Happy 45.4%
Surprised 45%
Calm 53%
Confused 45%
Fear 45.1%
Disgusted 45%

AWS Rekognition

Age 26-42
Gender Male, 54.9%
Fear 45.1%
Surprised 45.1%
Happy 46%
Confused 45.1%
Disgusted 45.3%
Angry 47.4%
Calm 50.7%
Sad 45.4%

AWS Rekognition

Age 18-30
Gender Male, 50.7%
Disgusted 45%
Sad 46.5%
Angry 48.2%
Happy 45.6%
Surprised 45.2%
Fear 45.5%
Calm 48.9%
Confused 45%

Microsoft Cognitive Services

Age 39
Gender Female
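Each AWS Rekognition block above reports a confidence score for every emotion, and the dominant emotion is simply the highest-scoring entry. A minimal sketch using the scores from the first face above (the helper name `dominant_emotion` is illustrative, not a Rekognition API call):

```python
# Emotion scores for the first AWS Rekognition face analysis above.
face_1 = {
    "Angry": 7.5, "Fear": 2.0, "Confused": 1.2, "Calm": 11.5,
    "Happy": 0.4, "Disgusted": 0.3, "Surprised": 0.3, "Sad": 76.7,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_1))  # ('Sad', 76.7)
```

For the other faces the scores cluster near 45–53%, so the dominant emotion (e.g. "Calm" for the second face) is a much weaker call than the 76.7% "Sad" here.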

Feature analysis

Amazon

Person 99.7%
Painting 91.6%

Captions

Microsoft
created on 2019-08-10

a person holding a book 34.3%
an old photo of a baby 34.2%