Human Generated Data

Title

Two Female Heads

Date

1895 - 1916

People

Artist: John Singer Sargent, American 1856 - 1925

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. Francis Ormond, 1937.11.65

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Human 98.4
Drawing 97.6
Art 97.6
Sketch 95.3
Person 93.7
Person 63.3
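
These labels have the shape of AWS Rekognition label-detection output (label name plus confidence). A minimal sketch of how such tags can be retrieved with boto3 follows; the file path, MaxLabels, and MinConfidence values are placeholders, not the settings used for this record.

    import boto3

    # Minimal sketch: label detection with AWS Rekognition via boto3.
    # "two_female_heads.jpg" is a hypothetical local file; credentials come
    # from the usual AWS configuration (environment, ~/.aws, or an IAM role).
    rekognition = boto3.client("rekognition")
    with open("two_female_heads.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,
            MinConfidence=50,
        )
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))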

Clarifai
created on 2019-04-10

people 99.9
portrait 99.9
adult 99.2
art 98.4
one 98.2
man 97.5
wear 94.3
illustration 93.7
print 93.4
painting 88.7
woman 87.7
facial expression 86.6
facial hair 86
engraving 85.6
two 82.1
retro 81.8
vintage 79.8
leader 75.8
antique 74.7
hair 71.3
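
Concept lists like this typically come from Clarifai's v2 prediction endpoint. A hedged sketch using the requests library follows; the API key, image URL, and model identifier are placeholders (substitute the actual general-model id), and nothing on this page documents which model produced these concepts.

    import requests

    # Hedged sketch of a Clarifai v2 predict call. GENERAL_MODEL_ID, the API key,
    # and the image URL are placeholders, not values taken from this record.
    resp = requests.post(
        "https://api.clarifai.com/v2/models/GENERAL_MODEL_ID/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))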

Imagga
created on 2019-01-29

sketch 100
drawing 100
representation 100
portrait 29.1
attractive 23.8
model 21.8
hair 21.4
face 21.3
fashion 19.6
close 19.4
pretty 18.9
money 18.7
sexy 18.5
person 18.4
adult 18.1
currency 17.1
people 16.7
cash 16.5
posing 16
makeup 15.6
skin 15.2
one 14.9
bill 14.3
banking 13.8
body 13.6
human 13.5
eyes 12.9
style 12.6
head 12.6
lady 12.2
feminine 12.2
man 12.1
expression 12
sensual 11.8
bank 11.7
black 11.4
brunette 11.3
dollar 11.2
art 11.1
paper 11
vintage 10.8
financial 10.7
look 10.5
business 10.3
economy 10.2
finance 10.2
gorgeous 10
wealth 9.9
studio 9.9
hairstyle 9.5
closeup 9.4
happy 9.4
savings 9.3
cute 9.3
lips 9.3
sensuality 9.1
lovely 8.9
banknote 8.7
dollars 8.7
old 8.4
eye 8
twenty 7.9
male 7.8
exchange 7.6
hand 7.6
erotic 7.6
cosmetics 7.5
pose 7.3
design 7.2
stylish 7.2
looking 7.2
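
Scores in this form resemble output from Imagga's v2 tagging endpoint, which authenticates with HTTP Basic credentials. A hedged sketch follows; the key, secret, and image URL are placeholders, and the response fields shown are assumptions based on that API rather than anything documented on this page.

    import requests

    # Hedged sketch of an Imagga v2 tagging request; credentials and the
    # image URL are placeholders.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/image.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))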

Google
created on 2019-01-29

Sketch 97.5
Drawing 97.4
Face 96.4
Head 90.6
Portrait 86.6
Self-portrait 84
Art 82.6
Forehead 82.2
Artwork 79.7
Illustration 77.2
Visual arts 68.6
Jaw 66.8
Figure drawing 66
Painting 63.8
Black-and-white 56.4
Facial hair 55.5
Style 51
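
These entries match Google Cloud Vision label annotations (description plus score). A minimal sketch with the google-cloud-vision client library follows; the file path is a placeholder, and since the API reports scores on a 0-1 scale, the 0-100 values above imply a simple rescaling.

    from google.cloud import vision

    # Minimal sketch: label detection with the Google Cloud Vision client library.
    # "two_female_heads.jpg" is a hypothetical path; credentials come from
    # GOOGLE_APPLICATION_CREDENTIALS.
    client = vision.ImageAnnotatorClient()
    with open("two_female_heads.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))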

Microsoft
created on 2019-01-29

text 98.1
book 93.4
drawing 93.4
sketch 39.7
portrait 10.2
charcoal 10.1
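
Tags of this kind are what Azure's Computer Vision "analyze" operation returns. A hedged REST sketch follows; the endpoint region, subscription key, file path, and API version (v2.0, consistent with the 2019 creation date) are assumptions.

    import requests

    # Hedged sketch of an Azure Computer Vision analyze request (v2.0-era REST API).
    # Endpoint region, key, and image file are placeholders.
    endpoint = "https://westus.api.cognitive.microsoft.com"
    key = "YOUR_SUBSCRIPTION_KEY"
    with open("two_female_heads.jpg", "rb") as f:
        data = f.read()
    resp = requests.post(
        f"{endpoint}/vision/v2.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/octet-stream"},
        data=data,
    )
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))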

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 87.6%
Angry 8.3%
Surprised 6.7%
Sad 37.3%
Happy 1.6%
Confused 5.1%
Calm 30.9%
Disgusted 10%
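
The age range, gender confidence, and emotion scores above follow the shape of AWS Rekognition face detection. A minimal sketch with boto3 follows; the file path is a placeholder.

    import boto3

    # Minimal sketch: face analysis with AWS Rekognition. Attributes=["ALL"]
    # returns AgeRange, Gender, and Emotions; the file path is a placeholder.
    rekognition = boto3.client("rekognition")
    with open("two_female_heads.jpg", "rb") as f:
        response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
        for emotion in face["Emotions"]:
            print(emotion["Type"], round(emotion["Confidence"], 1))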

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 24
Gender Female
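
The two blocks above are best read as separate face detections (one per head), each with an age estimate and gender, as returned by the Azure Face API's detect operation. A hedged REST sketch follows; the endpoint region, key, file path, and API version (v1.0) are assumptions.

    import requests

    # Hedged sketch of an Azure Face API detect request with age and gender
    # attributes; region, key, and file path are placeholders.
    endpoint = "https://westus.api.cognitive.microsoft.com"
    key = "YOUR_SUBSCRIPTION_KEY"
    with open("two_female_heads.jpg", "rb") as f:
        data = f.read()
    resp = requests.post(
        f"{endpoint}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/octet-stream"},
        data=data,
    )
    for face in resp.json():
        attrs = face["faceAttributes"]
        print("Age", attrs["age"], "Gender", attrs["gender"])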

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
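
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the entries above read "Very unlikely". A minimal sketch follows; the file path is a placeholder.

    from google.cloud import vision

    # Minimal sketch: face detection with Google Cloud Vision. Likelihoods are
    # enum buckets, not scores; "two_female_heads.jpg" is a placeholder path.
    client = vision.ImageAnnotatorClient()
    with open("two_female_heads.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger:", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy:", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)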

Feature analysis

Amazon

Person 93.7%

Categories

Imagga

paintings art 99%

Captions

Microsoft
created on 2019-01-29

a close up of a book 39.4%
close up of a book 33.3%
a hand holding a book 33.2%
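
Ranked captions with confidences like these are what Azure Computer Vision's "describe" operation returns. A hedged REST sketch closes the section; the endpoint region, key, file path, maxCandidates value, and API version (v2.0) are assumptions.

    import requests

    # Hedged sketch of the v2.0-era "describe" call that produces ranked captions.
    # Endpoint region, key, and file path are placeholders.
    endpoint = "https://westus.api.cognitive.microsoft.com"
    key = "YOUR_SUBSCRIPTION_KEY"
    with open("two_female_heads.jpg", "rb") as f:
        data = f.read()
    resp = requests.post(
        f"{endpoint}/vision/v2.0/describe",
        params={"maxCandidates": "3"},
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/octet-stream"},
        data=data,
    )
    for caption in resp.json()["description"]["captions"]:
        print(caption["text"], round(caption["confidence"] * 100, 1))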