Human Generated Data

Title

Mother and Child

Date

1903

People

Artist: William Orpen, British, 1878 - 1931

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Grenville L. Winthrop, Class of 1886, 1942.202

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2020-05-02

Painting 98.9
Art 98.9
Human 82.9
Person 76.2
Drawing 55.9
Photo 55.8
Photography 55.8
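
The Amazon tags above are the kind of output returned by AWS Rekognition's label detection. The following is a minimal sketch of how such labels could be requested with boto3; the local file name and the confidence floor are illustrative assumptions, not part of this record.

```python
# Sketch: requesting image labels from AWS Rekognition with boto3.
# The file name and MinConfidence value are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("mother_and_child_1942.202.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly matches the lowest score listed above
)

for label in response["Labels"]:
    # Prints pairs such as "Painting 98.9", mirroring the tag list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```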

Clarifai
created on 2020-05-02

art 99.9
people 99.8
painting 99.7
print 99.2
adult 98.6
illustration 98.4
engraving 97.9
man 97.9
woman 97.3
portrait 96.6
visuals 96.4
Renaissance 96.1
saint 95.6
two 95.6
baroque 94.9
religion 94
one 93.6
antique 92
nude 91.6
vintage 89.3

Imagga
created on 2020-05-02

sketch 100
drawing 88.3
representation 75.2
hair 18.2
sculpture 17.8
statue 17.7
art 16.5
portrait 16.2
one 14.9
people 13.9
face 13.5
model 13.2
person 12
close 12
money 11.9
attractive 11.9
human 11.2
sexy 11.2
skin 11
cash 11
architecture 10.9
man 10.8
body 10.4
fountain 10.2
banking 10.1
pretty 9.8
ancient 9.5
culture 9.4
old 9.1
adult 9.1
black 9
currency 9
marble 8.8
book jacket 8.8
love 8.7
antique 8.7
dollar 8.4
lady 8.1
fantasy 8.1
wealth 8.1
bank 8.1
bill 7.6
head 7.6
fashion 7.5
vintage 7.4
symbol 7.4
historic 7.3
sensuality 7.3
financial 7.1

Google
created on 2020-05-02

Microsoft
created on 2020-05-02

sketch 99.8
drawing 99.7
text 99.7
book 99.2
painting 98.3
art 89.1
child art 75.3
cartoon 55.7
human face 55.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Female, 73%
Confused 5.9%
Surprised 2.4%
Sad 6.2%
Disgusted 1.2%
Fear 4.9%
Calm 12.3%
Angry 6.2%
Happy 60.9%

AWS Rekognition

Age 45-63
Gender Female, 87.9%
Fear 0.6%
Sad 5.3%
Confused 0.5%
Calm 8.6%
Disgusted 0.4%
Happy 83%
Surprised 0.5%
Angry 1.1%

AWS Rekognition

Age 20-32
Gender Male, 84.6%
Confused 0.8%
Sad 68.4%
Disgusted 0.3%
Fear 20.2%
Surprised 1%
Happy 1.4%
Angry 1%
Calm 6.9%
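
The three face readings above (age range, gender, and emotion scores) correspond to AWS Rekognition's face-detection output. Below is a hedged sketch of how such an analysis could be requested; the file name is a placeholder.

```python
# Sketch: face analysis with AWS Rekognition. Attributes=["ALL"] is required
# to receive age-range, gender, and emotion estimates like those listed above.
import boto3

rekognition = boto3.client("rekognition")

with open("mother_and_child_1942.202.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "HAPPY"); capitalize for display.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```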

Feature analysis

Amazon

Painting 98.9%
Person 76.2%

Categories

Captions

Microsoft
created on 2020-05-02

an old photo of a person 71%
a person holding a book 25.2%
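
The Microsoft captions above are the kind of output returned by Azure Computer Vision's Describe Image operation. A minimal sketch against the v3.0 REST endpoint follows; the endpoint host, subscription key, and image URL are placeholders, not values from this record.

```python
# Sketch: image captioning via Azure Computer Vision's Describe Image operation
# (v3.0 REST API assumed). Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"  # placeholder
IMAGE_URL = "https://example.org/mother_and_child.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.0/describe",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # Prints lines such as "an old photo of a person 71.0%".
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```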

Text analysis

Amazon

1903
ORPEN 1903
ORPEN

Google

ORPAN 1903
ORPAN
1903
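
Both text readings above come from OCR services run on the sheet's inscription; the Amazon values match AWS Rekognition's text detection, while the Google values would come from a Cloud Vision text-detection call. A hedged sketch of the Rekognition call is below; the file name is a placeholder.

```python
# Sketch: OCR with AWS Rekognition's DetectText, which returns line- and
# word-level detections such as "ORPEN 1903". The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("mother_and_child_1942.202.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # Each detection is either a LINE or a WORD; both appear in the list above.
    print(detection["Type"], detection["DetectedText"])
```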