Human Generated Data

Title

Study for Seated Angel at Left, "Israel and the Law," Boston Public Library

Date

1895 - 1916

People

Artist: John Singer Sargent, American 1856 - 1925

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. Francis Ormond, 1937.11.30

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Human 98.2
Drawing 97.9
Art 97.9
Sketch 94.4
Person 87.7
Painting 69.2
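
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. A minimal sketch of how such tags could be produced, assuming boto3 credentials are configured; the file name and confidence threshold are placeholders, not values from this record:

```python
# Hedged sketch: retrieving image labels with AWS Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("study.jpg", "rb") as image_file:  # hypothetical local file
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=60,
    )

for label in response["Labels"]:
    # Each label carries a name and a confidence score (0-100),
    # comparable to the "Human 98.2", "Drawing 97.9" values above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```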

Clarifai
created on 2019-04-10

people 99.9
print 99.7
portrait 99.5
one 99.4
illustration 99.1
art 99
adult 98.8
wear 98.3
engraving 97.9
man 97.4
leader 93.6
painting 88.8
facial hair 88.5
chalk out 87.8
retro 85.3
vintage 84.8
veil 84
music 79.7
woodcut 77.1
etching 75.8
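
The Clarifai concepts above come from its general image-recognition model. A rough sketch of one way to query it over the Clarifai v2 REST API; the API key, model id, image URL, and response layout are assumptions rather than details taken from this record:

```python
# Hedged sketch: querying Clarifai's general model over the v2 REST API.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"        # assumption: a valid key
MODEL_ID = "general-image-recognition"   # assumption: the general model id

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/study.jpg"}}}]},
    timeout=30,
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 probabilities; scale by 100 to match the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```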

Imagga
created on 2019-01-18

sketch 100
drawing 100
representation 100
art 24.1
vintage 21.5
antique 20.8
retro 20.5
grunge 19.6
old 19.5
ancient 17.3
style 17.1
design 15.7
pattern 15.1
texture 14.6
paper 14.1
graphic 12.4
black 12
silhouette 11.6
ornament 11.2
body 11.2
floral 11.1
artwork 11
decoration 10.9
painting 10.8
frame 10.8
stamp 10.7
detail 10.5
shape 10.4
artistic 10.4
symbol 10.1
letter 10.1
man 10.1
religion 9.9
color 9.5
tattoo 9.3
flower 9.2
decorative 9.2
dirty 9
people 8.9
backgrounds 8.9
postmark 8.9
postage 8.9
mail 8.6
paint 8.2
aged 8.1
history 8.1
postal 7.9
face 7.8
scroll 7.6
religious 7.5
elements 7.4
church 7.4
portrait 7.1
cool 7.1
creative 7.1
modern 7
attractive 7
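
The Imagga tags above can be reproduced with its tagging endpoint. A hedged sketch assuming the v2 /tags REST API with HTTP basic authentication; the key, secret, image URL, and exact response layout are assumptions:

```python
# Hedged sketch: tagging an image with Imagga's v2 /tags endpoint.
import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/study.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Confidence is already on a 0-100 scale, like "sketch 100" above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```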

Google
created on 2019-01-18

figure drawing 93.1
art 88.1
drawing 85.6
standing 80.1
portrait 76.9
sketch 76
artwork 71.2
arm 67.6
self portrait 62.9
visual arts 59.2
painting 59
art model 53.2
chest 50.6
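
The Google tags above correspond to Cloud Vision label detection. A minimal sketch assuming the google-cloud-vision client library is installed and application credentials are configured; the file name is a placeholder:

```python
# Hedged sketch: label detection with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("study.jpg", "rb") as image_file:  # hypothetical local file
    image = vision.Image(content=image_file.read())

response = client.label_detection(image=image)
for annotation in response.label_annotations:
    # Scores are 0-1; scaled to match the "figure drawing 93.1" style above.
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```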

Microsoft
created on 2019-01-18

text 99.8
book 99.1
drawing 99.1
sketch 70
charcoal 14.4
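
The Microsoft tags above are of the kind produced by Azure Computer Vision image tagging. A hedged sketch using the azure-cognitiveservices-vision-computervision client as I understand it; the endpoint, key, and image URL are placeholders:

```python
# Hedged sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

result = client.tag_image("https://example.org/study.jpg")
for tag in result.tags:
    # Confidence is 0-1; scaled to match "text 99.8", "book 99.1" above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```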

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-66
Gender Female, 55.9%
Happy 14.3%
Confused 11.1%
Calm 11.8%
Surprised 7.7%
Disgusted 1.5%
Sad 50.7%
Angry 2.9%

AWS Rekognition

Age 23-38
Gender Female, 56.9%
Disgusted 5.2%
Sad 6.8%
Angry 6.1%
Surprised 13.3%
Happy 4.6%
Calm 58.6%
Confused 5.3%
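
The age-range, gender, and emotion estimates above are the kind of output AWS Rekognition face analysis returns when all facial attributes are requested. A minimal sketch, again with a placeholder file name:

```python
# Hedged sketch: face analysis with AWS Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("study.jpg", "rb") as image_file:  # hypothetical local file
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```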

Feature analysis

Amazon

Person 87.7%
Painting 69.2%

Categories

Imagga

paintings art 99.1%

Captions

Microsoft
created on 2019-01-18

a close up of a book 57.2%
close up of a book 51.2%
a hand holding a book 51.1%
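
The caption/confidence pairs above match the output of Azure Computer Vision's image description call. A hedged sketch with the same placeholder endpoint, key, and image URL as before:

```python
# Hedged sketch: caption generation with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

description = client.describe_image("https://example.org/study.jpg", max_candidates=3)
for caption in description.captions:
    # Confidence is 0-1; scaled to match "a close up of a book 57.2%" above.
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```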