Human Generated Data

Title

Sistine Madonna

Date

18th-19th century

People

Artist: Johann Friedrich Wilhelm Müller, German 1782 - 1816

Artist after: Raphael, Italian 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2858

Machine Generated Data

Tags

Amazon
created on 2019-11-06

Art 99.5
Painting 99.5
Human 98.7
Person 98.7
Person 98.2
Person 97.9
Person 85.7
Person 78.2
Archangel 57.1
Angel 57.1
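The Amazon list above repeats "Person" at several confidence levels, one entry per detected figure. When consuming such tag lists programmatically, a small helper can collapse duplicates to their highest score and drop low-confidence guesses. A minimal sketch, with the label list copied from the record above (the 90% threshold is an illustrative choice, not part of the source data):

```python
# Amazon-generated labels from the record above, as (name, confidence) pairs.
labels = [
    ("Art", 99.5), ("Painting", 99.5), ("Human", 98.7), ("Person", 98.7),
    ("Person", 98.2), ("Person", 97.9), ("Person", 85.7), ("Person", 78.2),
    ("Archangel", 57.1), ("Angel", 57.1),
]

def top_labels(labels, threshold=90.0):
    """Keep each label once, at its highest confidence, above the threshold."""
    best = {}
    for name, conf in labels:
        if conf >= threshold and conf > best.get(name, 0.0):
            best[name] = conf
    # Highest confidence first; ties keep insertion order (sort is stable).
    return sorted(best.items(), key=lambda kv: -kv[1])

print(top_labels(labels))
# -> [('Art', 99.5), ('Painting', 99.5), ('Human', 98.7), ('Person', 98.7)]
```

With the default threshold this reduces the ten raw entries to four distinct high-confidence tags, discarding the speculative "Archangel"/"Angel" guesses at 57.1%.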

Clarifai
created on 2019-11-06

people 99.9
group 99.3
art 99.3
adult 98.9
woman 97.6
religion 96.8
two 96.6
print 96.3
man 95.9
illustration 95.8
saint 95.7
three 95.5
baby 94.4
god 93
Renaissance 92.7
furniture 92.7
kneeling 92.4
child 92.4
book 92
veil 91.5

Imagga
created on 2019-11-06

statue 71.2
column 65
sculpture 64.6
religion 40.4
art 35.1
ancient 34.7
architecture 30.6
carving 27.5
religious 27.2
stone 26.6
old 26.5
monument 26.2
history 26
god 25.9
culture 22.3
figure 22.2
temple 21.7
marble 21.2
antique 20.1
tourism 19.8
travel 19.8
catholic 19.5
holy 19.3
church 17.6
building 17.5
cemetery 17.1
detail 16.9
historic 16.5
roman 16.4
historical 16
city 15.8
carved 15.7
spiritual 15.4
spirituality 15.4
decoration 14.9
famous 14.9
faith 14.4
pray 13.6
saint 13.5
worship 12.6
landmark 11.8
symbol 11.5
exterior 11.1
sketch 10.8
angel 10.8
plastic art 10.7
heritage 10.6
fountain 10.6
tourist 10
vintage 9.9
sculptures 9.9
face 9.3
decorative 9.2
statues 8.9
prayer 8.7
meditation 8.6
cross 8.5
design 8.3
traditional 8.3
divine 7.9
holiday 7.9
figurine 7.9
drawing 7.8
golden 7.8
facade 7.7
outdoor 7.7
head 7.6
tradition 7.4
artwork 7.3
museum 7.2
structure 7.1

Google
created on 2019-11-06

Microsoft
created on 2019-11-06

text 100
book 99.2
sketch 93.7
drawing 93.5
painting 76.6
woman 75.7
person 68.6
human face 67.3
old 44.3
clothes 16.6

Face analysis

AWS Rekognition

Age 15-27
Gender Female, 99%
Happy 0.3%
Sad 0.2%
Confused 0.1%
Angry 0%
Surprised 0.1%
Fear 0%
Calm 99.2%
Disgusted 0.1%

AWS Rekognition

Age 4-12
Gender Female, 53.7%
Happy 45%
Sad 46.9%
Disgusted 45.1%
Angry 50.1%
Calm 46.5%
Fear 45.6%
Surprised 45.2%
Confused 45.5%

AWS Rekognition

Age 3-11
Gender Female, 52.4%
Sad 45.1%
Confused 45.1%
Fear 45%
Happy 45.1%
Angry 46.8%
Disgusted 45.1%
Calm 52.4%
Surprised 45.3%

AWS Rekognition

Age 1-7
Gender Female, 50.6%
Disgusted 45%
Fear 45%
Sad 45.2%
Happy 45%
Surprised 45%
Angry 45.5%
Calm 54.3%
Confused 45%

AWS Rekognition

Age 24-38
Gender Female, 96.8%
Fear 0%
Surprised 0%
Calm 98.4%
Happy 1%
Angry 0.1%
Disgusted 0%
Confused 0%
Sad 0.3%

AWS Rekognition

Age 43-61
Gender Male, 50.6%
Fear 45.1%
Confused 45.9%
Happy 46.9%
Angry 50.7%
Disgusted 45.3%
Surprised 45.1%
Calm 45.6%
Sad 45.3%
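Each AWS Rekognition face record above reports a score for every emotion; the detected emotion is simply the one with the highest score. A minimal sketch of that selection, using the first face record from the data above:

```python
# Emotion scores copied from the first AWS Rekognition face record above.
face = {
    "Happy": 0.3, "Sad": 0.2, "Confused": 0.1, "Angry": 0.0,
    "Surprised": 0.1, "Fear": 0.0, "Calm": 99.2, "Disgusted": 0.1,
}

def dominant_emotion(scores):
    """Return the highest-scoring emotion and its confidence."""
    name = max(scores, key=scores.get)
    return name, scores[name]

print(dominant_emotion(face))  # -> ('Calm', 99.2)
```

Note that in several of the face records above the scores cluster between 45% and 55%, so the "dominant" emotion there is a marginal call rather than a confident one.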

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.5%
Person 98.7%

Categories

Imagga

paintings art 97.4%
people portraits 2.3%