Human Generated Data

Title

Fiddler Playing Before a House

Date

19th century

People

Artist: Léon Subercaze, French, dates unknown; active 1845–48

Artist after: Adriaen van Ostade, Dutch, 1610–1685

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R15167

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Art 98.7
Person 95.8
Human 95.8
Painting 90.5
Person 88.1
Person 83.9
Person 75.3
Person 72.9
Person 72.9
Person 47.8

Clarifai
created on 2019-11-07

people 99.9
print 99.7
art 99.6
illustration 99.5
adult 99.1
group 98.5
man 98.4
painting 98.3
engraving 97
woman 95.5
portrait 94.8
two 94.4
wear 93.2
leader 91.7
antique 90.9
old 88.5
woodcut 88.5
one 88.2
vintage 87.4
veil 85.7

Imagga
created on 2019-11-07

graffito 69.2
decoration 47.8
old 39.8
vintage 38.1
retro 26.2
grunge 25.6
wall 25.3
texture 23.6
antique 23.4
book jacket 21.1
aged 20.8
ancient 20.8
jacket 17.4
architecture 16.7
structure 16.5
money 15.3
paper 14.9
covering 14
detail 13.7
currency 13.5
stamp 12.8
wrapping 12.5
design 12.5
finance 11.8
memorial 11.8
art 11.8
symbol 11.5
window 11.3
building 11.2
letter 11
dirty 10.9
chalkboard 10.8
material 10.7
sketch 10.7
brass 10.6
mail 10.5
damaged 10.5
bill 10.5
black 10.2
stone 10.2
banking 10.1
historic 10.1
cash 10.1
frame 10
postage 9.8
sculpture 9.6
door 9.6
pattern 9.6
rusty 9.5
grungy 9.5
blackboard 9.5
envelope 9.4
travel 9.2
rough 9.1
business 9.1
bank 9
history 9
drawing 8.9
arch 8.9
postmark 8.9
brown 8.8
concrete 8.6
culture 8.6
facade 8.4
wallpaper 8.4
board 8.2
religion 8.1
surface 7.9
icon 7.9
postal 7.9
rich 7.5
church 7.4
representation 7.4
note 7.4
border 7.2
landmark 7.2
world 7.2
doormat 7.2
financial 7.1
country 7

Google
created on 2019-11-07

Microsoft
created on 2019-11-07

person 95.3
drawing 94.9
gallery 94.3
clothing 93
text 92.7
indoor 87.6
room 77.9
sketch 71.4
victorian 50.9
old 47.7
different 40.8
painting 15.8
picture frame 13.2

Face analysis

Amazon

AWS Rekognition

Age 19-31
Gender Male, 54.2%
Sad 46.2%
Happy 45.1%
Surprised 45.1%
Disgusted 45.1%
Calm 52.3%
Confused 45.1%
Angry 45.8%
Fear 45.3%

AWS Rekognition

Age 22-34
Gender Female, 50.3%
Calm 47.1%
Sad 48.9%
Disgusted 45.7%
Confused 45.4%
Happy 45%
Angry 45.3%
Fear 46.7%
Surprised 45.8%

AWS Rekognition

Age 24-38
Gender Male, 53.4%
Surprised 45.2%
Disgusted 45.4%
Calm 45.1%
Fear 46.7%
Happy 45.1%
Sad 46.7%
Angry 50.5%
Confused 45.2%

AWS Rekognition

Age 23-35
Gender Male, 54.7%
Angry 45.2%
Calm 54.5%
Sad 45.2%
Fear 45%
Disgusted 45%
Confused 45%
Happy 45%
Surprised 45%

AWS Rekognition

Age 24-38
Gender Female, 50.5%
Angry 45.6%
Sad 47.6%
Surprised 45%
Happy 45.1%
Confused 45%
Calm 51.4%
Disgusted 45%
Fear 45.1%

AWS Rekognition

Age 30-46
Gender Male, 54.2%
Angry 46.3%
Disgusted 45.1%
Surprised 45.5%
Happy 47%
Sad 45.6%
Confused 45%
Fear 47.1%
Calm 48.5%

AWS Rekognition

Age 33-49
Gender Male, 53.9%
Surprised 45.1%
Angry 45.3%
Happy 45%
Calm 51.1%
Disgusted 45%
Confused 45.1%
Sad 48.3%
Fear 45.2%

Feature analysis

Amazon

Person 95.8%
Painting 90.5%

Captions

Microsoft

an old photo of a person 53.4%
old photo of a person 49.2%
an old photo of a person in a room 49.1%

Text analysis

Google

reder es yeAetap A.03 ladeyort
reder
A.03
es
yeAetap
ladeyort