Human Generated Data

Title

Illustration to Tristram Shandy, VI Th., p.11: A Couple at a Child's Bedside, the Woman Shushing the Man

Date

18th-19th century

People

Artist: Daniel Berger, German 1744 - 1824

Artist after: Daniel Nikolaus Chodowiecki, German 1726 - 1801

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R14541

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Person 99.5
Human 99.5
Person 98.5
Person 97.1
Art 95.9
Painting 94.3
Person 89.2
Person 73.2
Person 68.6
Animal 66.1
Horse 66.1
Mammal 66.1
Poster 60
Advertisement 60

Clarifai
created on 2019-08-10

people 99.9
print 99.9
illustration 99.4
art 99
group 98.4
engraving 98
adult 97.6
man 97.5
wear 94.8
vintage 93.8
leader 92.1
woodcut 92
old 90.8
antique 90.1
retro 89.9
two 89.5
painting 89.2
one 88.5
portrait 87.9
woman 87.9

Imagga
created on 2019-08-10

grunge 37.5
old 33.4
vintage 33.1
antique 32.4
art 28.4
texture 27.8
retro 26.2
aged 24.4
ancient 23.4
dirty 21.7
grungy 20.9
structure 19.2
frame 19.2
decoration 18.6
stone 18.6
paper 18.1
sculpture 17.7
statue 17.6
architecture 17.4
paint 17.2
material 17
pattern 16.4
detail 16.1
wall 15.9
textured 15.8
rough 15.5
memorial 15.4
worn 15.3
damaged 15.3
design 14.7
border 14.5
mosaic 14.3
history 14.3
decorative 14.2
surface 14.1
historic 13.8
culture 13.7
weathered 13.3
building 13.3
monument 13.1
artistic 13
stucco 12.9
brown 12.5
historical 12.2
fountain 11.6
torn 11.6
landmark 10.8
rust 10.6
parchment 10.6
old fashioned 10.5
blank 10.3
famous 10.2
black 10.2
page 10.2
city 10
gravestone 9.9
tourism 9.9
religion 9.9
travel 9.9
close 9.7
stained 9.6
painted 9.5
transducer 9.5
card 9.5
flower 9.2
wallpaper 9.2
ornate 9.2
marble 8.9
color 8.9
symbol 8.8
stamp 8.7
messy 8.7
stain 8.6
religious 8.4
lace 8.3
element 8.3
letter 8.3
style 8.2
romantic 8
postmark 7.9
urban 7.9
scratch 7.8
empty 7.7
temple 7.7
aging 7.7
mail 7.7
floral 7.7
rusty 7.6
arch 7.4
backdrop 7.4
sketch 7.4
global 7.3
tourist 7.3
electrical device 7.1
column 7

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

text 99.5
clothing 97.6
person 96.7
book 92.6
woman 89
drawing 76.6
engraving 63.7
old 59.3
photograph 53.2

Face analysis

Amazon

AWS Rekognition

Age 20-32
Gender Female, 53.4%
Disgusted 45%
Calm 48.6%
Fear 45%
Angry 45.1%
Happy 51%
Confused 45%
Sad 45.1%
Surprised 45.2%

AWS Rekognition

Age 23-35
Gender Male, 53.3%
Confused 45%
Disgusted 45%
Sad 45%
Happy 45%
Surprised 45%
Angry 45%
Calm 54.9%
Fear 45%

AWS Rekognition

Age 32-48
Gender Male, 50.2%
Angry 49.5%
Disgusted 49.5%
Fear 49.5%
Sad 50%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Calm 50%

AWS Rekognition

Age 32-48
Gender Female, 52.6%
Happy 45.1%
Calm 48.7%
Disgusted 45.4%
Fear 46.7%
Surprised 45.8%
Confused 46.1%
Angry 46.4%
Sad 45.9%

Feature analysis

Amazon

Person 99.5%
Horse 66.1%
Poster 60%

Text analysis

Amazon

h
134
VII.Th. pag.134 h
D.BergerJa
VII.Th. pag.134
X

Google

VIITh.pag.134 D.Berger Je
VIITh.pag.134
D.Berger
Je