Human Generated Data

Title

The Recital

Date

19th century

People

Artist: Édouard Moyse, French 19th century

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R10289

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Art 98.1
Painting 94.8
Human 88.9
Person 88.9
Person 87.8
Person 80.6
Person 76.5
Person 75.8

Clarifai
created on 2019-03-22

people 100
art 99.6
group 99.6
adult 99.4
print 99.1
painting 99.1
illustration 98.6
wear 97.4
man 96.6
furniture 96.5
woman 94
portrait 93.8
many 92.7
museum 92.1
child 92.1
facial hair 91.5
exhibition 90.4
room 90
two 88.7
leader 88.7

Imagga
created on 2019-03-22

brass 39.5
old 38.3
memorial 37.3
vintage 37.2
structure 33.5
grunge 30.6
antique 23.4
texture 22.9
aged 22.6
dirty 21.7
wall 20.6
art 20.3
retro 19.7
ancient 19
grungy 17.1
frame 16.6
blackboard 16.2
damaged 15.2
black 15
letter 14.7
pattern 14.3
rusty 14.3
material 14.3
design 14.2
postmark 13.8
postage 13.7
postal 13.7
border 13.6
stamp 13.5
mail 13.4
global 12.8
rough 12.7
paint 12.7
paper 12.5
envelope 12
printed 11.8
weathered 11.4
textured 11.4
decoration 11.3
circa 10.8
surface 10.6
architecture 10.1
shows 9.8
close 9.7
messy 9.7
world 9.6
rust 9.6
communications 9.6
brown 9.6
dirt 9.5
painted 9.5
post 9.5
symbol 9.4
history 8.9
detail 8.8
renaissance 8.8
museum 8.7
letters 8.7
stone 8.7
obsolete 8.6
blank 8.6
culture 8.5
unique 8.5
wallpaper 8.4
building 8.4
color 8.3
gray 8.1
drawing 8.1
zigzag 7.9
paintings 7.8
delivery 7.8
cutting 7.7
decay 7.7
television 7.7
card 7.6
worn 7.6
dark 7.5
one 7.5
backdrop 7.4
artwork 7.3
message 7.3
screen 7.2
covering 7.1
book jacket 7

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

room 100
scene 100
gallery 100
wall 98.6
old 93.4
posing 65.3
vintage 40.3
painting 17.1
art 17.1
museum 17
boy 13.2
man 12.9
monochrome 10.4
black and white 7.4
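Each service above reports tags as label/confidence pairs on a 0-100 scale. A minimal sketch of how such a list might be filtered to the high-confidence tags (the `filter_tags` helper is hypothetical, not part of any service's API; the sample data is copied from the Microsoft tags above):

```python
# Hypothetical helper: keep only machine-generated tags at or above a
# confidence threshold, sorted by confidence, highest first.
def filter_tags(tags, threshold=90.0):
    """Return (tag, confidence) pairs with confidence >= threshold."""
    return sorted(
        ((tag, conf) for tag, conf in tags.items() if conf >= threshold),
        key=lambda pair: pair[1],
        reverse=True,
    )

# A few of the Microsoft tags from this record.
microsoft_tags = {"room": 100, "scene": 100, "gallery": 100,
                  "wall": 98.6, "old": 93.4, "posing": 65.3}

print(filter_tags(microsoft_tags))
# keeps room, scene, gallery, wall, old; drops posing (65.3)
```

The same shape works for the Amazon, Clarifai, and Imagga lists, since all four report on the same 0-100 scale.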

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.5%
Confused 45.4%
Calm 49.4%
Disgusted 45.1%
Surprised 45.3%
Sad 49.1%
Angry 45.6%
Happy 45.1%

AWS Rekognition

Age 23-38
Gender Male, 50.7%
Calm 45.8%
Disgusted 45.6%
Surprised 45.2%
Sad 49.4%
Angry 45.7%
Happy 48.1%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Male, 54.1%
Calm 50.3%
Disgusted 45.1%
Confused 45.2%
Happy 45.1%
Sad 48.7%
Surprised 45.2%
Angry 45.4%

AWS Rekognition

Age 23-38
Gender Female, 53.1%
Sad 50.3%
Calm 48.5%
Happy 45.5%
Angry 45.3%
Disgusted 45.1%
Surprised 45.1%
Confused 45.2%

AWS Rekognition

Age 60-90
Gender Female, 50%
Sad 47.5%
Angry 45.5%
Calm 48.8%
Surprised 45.9%
Confused 45.4%
Happy 45.5%
Disgusted 46.5%
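Each Rekognition face result above lists a confidence score for every emotion, so a single label per face has to be derived by taking the highest-scoring entry. A minimal sketch (the `dominant_emotion` helper is hypothetical; the scores are copied from the first face above):

```python
# Hypothetical helper: reduce a Rekognition-style emotion breakdown to the
# single most confident (emotion, confidence) pair.
def dominant_emotion(scores):
    """Return the emotion with the highest confidence."""
    return max(scores.items(), key=lambda pair: pair[1])

# Emotion scores for the first detected face in this record.
face_1 = {"Confused": 45.4, "Calm": 49.4, "Disgusted": 45.1,
          "Surprised": 45.3, "Sad": 49.1, "Angry": 45.6, "Happy": 45.1}

print(dominant_emotion(face_1))  # -> ('Calm', 49.4)
```

Note that the scores within each face cluster tightly (mostly 45-50%), so the dominant label is a narrow margin rather than a strong prediction.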

Feature analysis

Amazon

Painting 94.8%
Person 88.9%

Captions

Microsoft

a vintage photo of a person 78.1%
a vintage photo of a person in a room 78%
a vintage photo of a person in a white room 68.5%