Human Generated Data

Title

Musicians and Learned Men Entertaining the Rich Woman

Date

18th century

People

Artist: Philibert Benoit de Larue, French 1718 - 1780

Artist after: Louis Binet, French 1744 - 1800

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of W. G. Russell Allen and Paul J. Sachs, M9252

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Person 98.8
Human 98.8
Art 98.3
Person 96.3
Painting 96
Person 87.9
Bird 80
Animal 80
Person 79.3
Mammal 79.2
Cat 79.2
Pet 79.2

Clarifai
created on 2019-11-09

people 99.9
art 96.7
print 96
illustration 95.3
adult 95
group 94.8
painting 94.7
one 94.3
man 93.8
leader 87.8
portrait 86.7
wear 86.5
text 85.7
military 82.8
no person 82.6
retro 81
engraving 79
two 78.3
soldier 78.1
war 78

Imagga
created on 2019-11-09

bookmark 41.4
book jacket 37.4
old 34.2
vintage 34.1
jacket 30.1
antique 25.1
grunge 23.8
graffito 23.7
wrapping 22.1
retro 22.1
decoration 20.7
texture 20.2
stamp 20.1
ancient 19.9
paper 19.6
aged 18.1
wall 17.1
covering 16.1
letter 15.6
mail 15.3
door 15.2
postage 14.7
art 14.6
design 14
frame 13.3
postmark 12.8
stone 12.4
lighter 11.8
post 11.4
binding 11
philately 10.9
postal 10.8
device 10.7
travel 10.6
pattern 10.3
envelope 10.1
dirty 9.9
brown 9.6
culture 9.4
architecture 9.4
building 8.9
symbol 8.8
parchment 8.6
worn 8.6
card 8.5
house 8.4
page 8.4
structure 8.3
global 8.2
religion 8.1
history 8.1
world 8
country 8
correspondence 7.8
entrance 7.7
blank 7.7
spotted 7.7
board 7.7
rustic 7.7
temple 7.6
book 7.6
historic 7.3
artwork 7.3
rough 7.3
container 7.3
box 7.2
material 7.1

Google
created on 2019-11-09

Microsoft
created on 2019-11-09

drawing 94.8
cartoon 92.7
text 86.8
indoor 86.4
person 80.6
old 79.6
clothing 72.5
sketch 71.9
white 71
mammal 50.2

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Male, 53.9%
Fear 45.1%
Happy 47.5%
Surprised 45.1%
Angry 50.4%
Sad 45.1%
Confused 45.1%
Calm 46.4%
Disgusted 45.2%

AWS Rekognition

Age 15-27
Gender Male, 50.4%
Calm 50%
Angry 49.8%
Sad 49.6%
Fear 49.5%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%

AWS Rekognition

Age 23-35
Gender Male, 50.4%
Fear 49.5%
Sad 49.5%
Angry 50.5%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 35-51
Gender Male, 52.3%
Fear 45%
Sad 45.2%
Confused 45%
Angry 45%
Disgusted 45%
Calm 54.8%
Happy 45%
Surprised 45%

AWS Rekognition

Age 33-49
Gender Male, 54.4%
Surprised 45%
Confused 45%
Calm 54%
Happy 45.7%
Angry 45.2%
Sad 45%
Fear 45%
Disgusted 45%

Feature analysis

Amazon

Person 98.8%
Bird 80%
Cat 79.2%

Captions

Microsoft
created on 2019-11-09

an old photo of a cat 25.8%
an old photo of a person 25.7%
an old photo of a box 25.6%

Text analysis

Amazon

APAYSANNE
PERVeRTIE
APAYSANNE E
E
Tom 3 . 6 P.
64

Google

Tom 3. 6 P APAYSANN PERVERTI
Tom
3.
6
P
APAYSANN
PERVERTI