Human Generated Data

Title

Sistine Madonna

Date

1826

People

Artist: Hyacinthe Louis Victor J.-B. Aubry-Lecomte, French, 1797 - 1858

Artist after: Raphael, Italian, 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G70

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Art 99.1
Painting 99.1
Human 98.6
Person 98.6
Person 98.3
Person 97.8
Person 96.3
Person 93.7
Face 66.9
Photo 66.9
Photography 66.9
Portrait 66.9
Drawing 55.8

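The Amazon tags above are standard label-detection output. As a minimal, hypothetical sketch of how such labels could be regenerated with the AWS SDK (boto3), assuming configured AWS credentials and a local copy of the print saved as sistine_madonna.jpg (both assumptions, not part of the record):

```python
# Hypothetical sketch: label detection with Amazon Rekognition via boto3.
# "sistine_madonna.jpg" and the AWS credentials/region are assumptions.
import boto3

client = boto3.client("rekognition")
with open("sistine_madonna.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes}, MaxLabels=20, MinConfidence=50
)
for label in response["Labels"]:
    # Prints lines like "Art 99.1", matching the format of the tags above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```
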
Clarifai
created on 2019-11-07

people 99.9
one 99
art 98.9
adult 98.9
two 98.9
print 97.9
group 97.6
portrait 97.1
man 97.1
woman 96.7
illustration 96.2
furniture 94.9
leader 93.6
painting 93.4
three 92.9
nude 92.2
religion 89.4
saint 87.8
gown (clothing) 87.7
baroque 87.3

Imagga
created on 2019-11-07

sketch 35.2
sculpture 31.5
drawing 25.4
statue 24.5
art 23.8
representation 22.9
portrait 22
religion 20.6
culture 19.7
carving 19.1
old 16.7
ancient 16.4
face 16.3
god 16.3
antique 15.8
church 15.7
vintage 14.9
fashion 14.3
history 14.3
model 14
decoration 13.9
historic 13.8
people 13.4
historical 13.2
sexy 12.9
catholic 12.6
holy 12.5
saint 12.5
architecture 12.5
man 12.5
figure 12.4
faith 11.5
religious 11.2
style 11.1
hair 11.1
adult 11
makeup 11
person 10.8
symbol 10.8
detail 10.5
love 10.3
famous 10.2
sensual 10
sensuality 10
dress 9.9
romantic 9.8
pretty 9.8
prayer 9.7
couple 9.6
monument 9.3
male 9.2
decorative 9.2
close 9.1
black 9.1
design 9
one 9
lady 8.9
book jacket 8.8
body 8.8
pray 8.7
luxury 8.6
passion 8.5
head 8.4
attractive 8.4
human 8.2
plastic art 8.1
marble 7.9
brunette 7.8
artist 7.7
spiritual 7.7
two 7.6
make 7.3
museum 7.2

Google
created on 2019-11-07

Microsoft
created on 2019-11-07

text 99.9
book 97.5
drawing 96.9
sketch 96.2
painting 93.9
indoor 90
cartoon 72.7
different 50.5

Face analysis

AWS Rekognition

Age 10-20
Gender Female, 54.9%
Calm 54.1%
Confused 45%
Sad 45.5%
Surprised 45%
Angry 45%
Disgusted 45%
Fear 45%
Happy 45.3%

AWS Rekognition

Age 2-8
Gender Female, 53.6%
Fear 45.1%
Sad 49.2%
Calm 49.3%
Happy 45.1%
Surprised 45%
Confused 45.1%
Disgusted 45.1%
Angry 46%

AWS Rekognition

Age 5-15
Gender Female, 54.5%
Disgusted 45%
Fear 45%
Happy 45%
Confused 45.1%
Sad 45%
Surprised 45.6%
Angry 45%
Calm 54.2%

AWS Rekognition

Age 4-14
Gender Male, 50.9%
Calm 54.4%
Confused 45%
Happy 45%
Angry 45.1%
Disgusted 45%
Fear 45%
Sad 45.5%
Surprised 45%

AWS Rekognition

Age 5-15
Gender Female, 54.9%
Sad 45.1%
Calm 53.7%
Surprised 45%
Disgusted 45%
Happy 46.2%
Fear 45%
Confused 45%
Angry 45%

AWS Rekognition

Age 36-52
Gender Male, 54.9%
Sad 45.2%
Disgusted 45.1%
Happy 45.2%
Calm 52.5%
Angry 46.6%
Confused 45%
Surprised 45.3%
Fear 45.1%

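The age, gender, and emotion estimates above correspond to Amazon Rekognition's face-detection output, one block per detected face. A minimal sketch of retrieving them, again assuming boto3, configured credentials, and the hypothetical file sistine_madonna.jpg:

```python
# Hypothetical sketch: per-face attributes with Amazon Rekognition.
import boto3

client = boto3.client("rekognition")
with open("sistine_madonna.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # One confidence per emotion class, e.g. "Calm 54.1%".
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```
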
Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

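The Google Vision blocks report face attributes as likelihood buckets (from Very unlikely to Very likely) rather than percentages. A minimal sketch of producing them, assuming google-cloud-vision 2.x, application default credentials, and the same hypothetical image file:

```python
# Hypothetical sketch: face likelihoods with the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("sistine_madonna.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or VERY_LIKELY.
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{field}_likelihood")
        print(field.capitalize(), vision.Likelihood(value).name)
```
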
Feature analysis

Amazon

Painting 99.1%
Person 98.6%