Human Generated Data

Title

Music Lesson

Date

19th century

People

Artist: Pierre Louis Henri Laurent, French, 1779-1844

Artist after: Gonzales Coques, Flemish, c. 1614-1684

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R9233

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Human 99.1
Person 99.1
Person 99
Person 98
Painting 97.7
Art 97.7

Clarifai
created on 2019-11-10

people 100
group 99.6
furniture 99.5
adult 99.1
room 98
print 97.8
seat 97.3
administration 96.8
leader 96.6
woman 95.1
two 94.8
man 94.5
home 93.9
child 93.5
sit 91.9
chair 91.7
canine 91.6
engraving 91.6
many 90.5
illustration 89.1

Imagga
created on 2019-11-10

throne 70.8
chair of state 56.7
chair 46.3
architecture 43.2
religion 35
seat 31.4
building 31.2
sculpture 30.1
art 28.9
ancient 28.6
monument 28
old 27.9
history 26.8
tourism 26.4
travel 26.1
landmark 25.3
stone 24.7
religious 24.4
church 23.1
historical 22.6
temple 21.2
historic 21.1
statue 21.1
famous 20.5
arch 20.3
culture 18.8
city 18.3
palace 18.1
cathedral 17.4
room 16.7
furniture 16.6
classroom 16.1
spirituality 15.4
god 15.3
catholic 15.1
antique 15.1
tourist 13.6
holy 13.5
gold 13.2
marble 12.6
interior 12.4
decoration 12.3
detail 12.1
golden 12
wall 12
structure 11.7
color 11.7
altar 11.4
carving 11.2
vintage 10.8
sacred 10.7
prayer 10.6
saint 10.6
catholicism 9.8
carved 9.8
heritage 9.7
architectural 9.6
spiritual 9.6
facade 9.4
traditional 9.1
style 8.9
baroque 8.8
pray 8.7
worship 8.7
faith 8.6
east 8.4
town 8.4
memorial 8.1
retro 7.4
exterior 7.4
ornate 7.3
column 7.2
holiday 7.2

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

furniture 98.1
text 97.5
old 91.9
chair 91.8
clothing 85
person 84
table 76.2
woman 67.4
piano 57.4
family 17.9
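
The label-and-confidence pairs above are the raw output of each tagging service, reported on a 0-100 percent scale. As a minimal sketch only (assuming boto3, configured AWS credentials, and a hypothetical local file music_lesson.jpg, none of which come from this record), tags in the Amazon format could be requested like this:

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("music_lesson.jpg", "rb") as f:  # hypothetical local copy of the print
        image_bytes = f.read()

    # DetectLabels returns name/confidence pairs analogous to the tag list above
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=80,
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')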

Color Analysis

Face analysis

AWS Rekognition

Age 26-42
Gender Female, 54.9%
Fear 45%
Happy 45%
Surprised 45%
Calm 54.1%
Sad 45.7%
Angry 45%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 13-23
Gender Female, 54.4%
Angry 45.2%
Confused 45.1%
Disgusted 45.1%
Fear 45.1%
Surprised 45.1%
Calm 50.4%
Happy 47.8%
Sad 46.2%

AWS Rekognition

Age 31-47
Gender Male, 50.3%
Calm 49.7%
Surprised 49.9%
Angry 49.5%
Sad 49.8%
Fear 49.5%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
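
A similar minimal sketch (same assumptions as above: boto3 with configured credentials and a hypothetical local image file) of how per-face age, gender, and emotion estimates like the AWS Rekognition figures above could be obtained:

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("music_lesson.jpg", "rb") as f:  # hypothetical local copy of the print
        image_bytes = f.read()

    # DetectFaces with Attributes=["ALL"] includes age range, gender, and emotion scores
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')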

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Painting 97.7%

Categories

Text analysis

Google

Gele Coues Lurent Er r d Ownehar Chssda
Gele
Coues
Lurent
Er
r
d
Ownehar
Chssda