Human Generated Data

Title

The Virgin and Child

Date

15th century

People

Artist: Master of the Pomegranate, Italian, active c. 1450 - 1475

Previous attribution: Pesellino (Francesco di Stefano), Italian, 1422 - 1457

Classification

Paintings

Machine Generated Data

Tags

Amazon

Art 99.3
Painting 99.3
Human 84.6
Person 84.6
Archangel 57.1
Angel 57.1

Clarifai

art 99.3
painting 99.2
people 97.2
saint 97.1
religion 97.1
Renaissance 95.9
one 95.7
Mary 95
illustration 93.2
old 91.8
baby 91.8
aura 90.8
no person 90.6
print 89.6
god 88.8
church 88.2
book 87.7
adult 87.7
ancient 87.2
antique 86.9

Imagga

art 34.2
religion 31.4
sculpture 27.8
prayer rug 26.5
temple 24.6
carving 23.1
furnishing 21.6
rug 21.1
god 20.1
old 19.5
religious 18.7
culture 17.9
ancient 17.3
church 16.6
mosaic 16.2
floor cover 16.1
architecture 15.7
travel 15.5
statue 15.2
throne 14.6
decoration 14.5
holy 14.4
antique 13.9
golden 13.8
covering 13.7
close 13.1
furniture 12.9
face 12.8
style 12.6
pray 12.6
prayer 12.5
portrait 12.3
chair of state 11.7
history 11.6
spiritual 11.5
gold 11.5
interior 11.5
faith 11.5
man 10.7
vintage 10.7
sacred 10.7
museum 10.7
spirituality 10.6
plastic art 10.5
cradle 10.5
detail 10.5
icon 10.3
chair 10.3
east 10.3
wall 10.3
baby bed 10.2
masterpiece 9.9
bible 9.8
one 9.7
design 9.6
oriental 9.4
historical 9.4
male 9.3
plaything 9.3
stone 9.3
teddy 9.2
painter 9.2
decorative 9.2
people 8.9
person 8.9
symbol 8.7
palace 8.7
artist 8.7
saint 8.7
figure 8.6
child 8.5
wicker 8.3
historic 8.3
seat 7.9
belief 7.8
meditation 7.7
fashion 7.5
pattern 7.5
monument 7.5
famous 7.4
tourism 7.4
peace 7.3
color 7.2
building 7.1

Google

Microsoft

text 94.2
book 90.7
picture frame 9.5

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 77.1%
Disgusted 82.1%
Sad 4.6%
Calm 4.3%
Angry 4.2%
Confused 3.1%
Surprised 1.3%
Happy 0.5%

AWS Rekognition

Age 20-38
Gender Female, 97.6%
Sad 31.5%
Disgusted 22.8%
Angry 12.6%
Calm 12.2%
Confused 8.8%
Surprised 7.0%
Happy 5.0%

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.3%
Person 84.6%

Captions

Microsoft

a teddy bear sitting on top of a book 16.6%
a teddy bear holding a book 16.5%
a teddy bear sitting next to a book 13%

Text analysis

Amazon

AA