Human Generated Data

Title

Man, Woman, and Dog

Date

17th century

People

Artist: Pieter Jansz. Quast, Dutch 1605/6 - 1647

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G3168

Machine Generated Data

Tags

Amazon
created on 2019-08-07

Person 98.6
Human 98.6
Person 93.1
Art 85.7
Clothing 75.7
Hat 75.7
Apparel 75.7
Painting 73.1
Leisure Activities 57.8
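
The label names and confidence scores above match the output format of Amazon Rekognition's DetectLabels API. Below is a minimal sketch of how such tags could be retrieved with boto3; the image filename and confidence threshold are assumptions for illustration, not part of the record.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the digitized print.
with open("quast_man_woman_and_dog.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed threshold; the lowest score above is 57.8
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")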

Clarifai
created on 2019-08-07

people 99.9
illustration 99.9
print 99.9
art 99.5
engraving 99.4
adult 98.4
two 96.5
man 96.4
lithograph 96.4
woodcut 96.1
etching 96
wear 96
antique 95.8
vintage 94.7
sepia pigment 94.4
painting 94.1
weapon 92.8
facial hair 92.4
visuals 92.3
book bindings 92.2

Imagga
created on 2019-08-07

sketch 78.1
drawing 61.5
representation 46.6
art 27.5
old 21.6
retro 21.3
antique 21.2
book jacket 20.2
grunge 19.6
design 19.4
vintage 18.2
ancient 17.3
jacket 15.7
paper 15.7
decoration 15.1
black 15
frame 15
pattern 13.7
style 13.3
decorative 12.5
texture 12.5
graphic 12.4
artistic 12.2
ornament 12.1
wrapping 11.9
architecture 11.7
color 11.7
aged 10.9
painting 10.8
symbol 10.8
man 10.7
religion 10.7
statue 10.6
tattoo 10.3
sculpture 10.2
flower 10
culture 9.4
religious 9.4
light 9.3
covering 9.3
close 9.1
detail 8.8
textured 8.8
floral 8.5
travel 8.4
creativity 8.4
page 8.3
traditional 8.3
element 8.3
backdrop 8.2
ornate 8.2
fantasy 8.1
backgrounds 8.1
body 8
shape 7.9
sepia 7.8
wall 7.7
card 7.6
oriental 7.5
classic 7.4
person 7.4
closeup 7.4
artwork 7.3
paint 7.2
dirty 7.2
holiday 7.2
history 7.2
building 7.1
tree 7.1

Google
created on 2019-08-07

Microsoft
created on 2019-08-07

text 100
book 99.8
sketch 99.2
drawing 98.9
cartoon 96.6
person 79.6
illustration 65.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Male, 96.5%
Happy 1.7%
Calm 5%
Angry 17%
Confused 7.9%
Disgusted 47.7%
Sad 18%
Surprised 2.7%
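
The age range, gender, and emotion percentages above are the attribute set returned by Rekognition's DetectFaces API when all facial attributes are requested. A minimal sketch follows, assuming the same hypothetical image file as above.

import boto3

rekognition = boto3.client("rekognition")

with open("quast_man_woman_and_dog.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to include AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")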

Feature analysis

Amazon

Person 98.6%
Hat 75.7%
Painting 73.1%

Categories

Captions

Microsoft
created on 2019-08-07

a close up of a book 57.3%
close up of a book 50.9%
a hand holding a book 50.8%
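
Captions with confidence scores in this form are what the Azure Computer Vision Describe Image endpoint returns. The sketch below is only an assumption about how they could have been produced; the region, API version, key placeholder, and filename are all hypothetical.

import requests

endpoint = "https://westus.api.cognitive.microsoft.com"  # assumed region
url = f"{endpoint}/vision/v2.0/describe"

with open("quast_man_woman_and_dog.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = requests.post(
    url,
    headers={
        "Ocp-Apim-Subscription-Key": "<subscription-key>",  # placeholder
        "Content-Type": "application/octet-stream",
    },
    params={"maxCandidates": 3},  # request several candidate captions
    data=image_bytes,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")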

Text analysis

Amazon

R
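
The single detected string "R" is consistent with output from Rekognition's DetectText API on a print with little legible lettering. A minimal sketch, with the filename again assumed:

import boto3

rekognition = boto3.client("rekognition")

with open("quast_man_woman_and_dog.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the duplicate WORD-level entries
        print(detection["DetectedText"])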