Human Generated Data

Title

The Virgin and Child

Date

c. 1430

People

Artist: Unidentified Artist

Classification

Sculpture

Machine Generated Data

Tags

Amazon

Figurine 98.1
Sculpture 97.4
Art 97.4
Statue 91
Person 85.6
Human 85.6
Archaeology 62.7
Bronze 56

Clarifai

sculpture 99.7
art 99.3
religion 98.6
statue 98.6
one 97.4
no person 96.6
god 95.1
saint 94.7
figurine 92.9
veil 92.8
gold 91.2
spirituality 90.9
painting 89.8
baroque 89.7
wear 89.2
ancient 89.2
crown 88.7
Mary 88.2
Renaissance 87.7
metalwork 86.7

Imagga

statue 100
sculpture 50
religion 34.1
art 29.5
costume 26.6
culture 23.9
religious 22.5
old 22.3
god 22
history 21.5
carving 20.4
saint 20.2
ancient 19
catholic 17.5
dress 17.2
antique 16.6
figure 15.5
spiritual 15.4
faith 15.3
stone 15.2
traditional 15
face 14.9
person 14.2
monument 14
detail 13.7
clothing 13.4
architecture 13.3
temple 12.7
pray 12.6
church 12
style 11.9
pedestal 11.6
roman 11.4
fashion 11.3
portrait 11
people 10.6
holy 10.6
spirituality 10.6
lady 10.6
adult 10.4
tradition 10.2
model 10.1
symbol 10.1
historic 10.1
tourism 9.9
travel 9.9
man 9.4
historical 9.4
city 9.2
plastic art 9.1
dinner dress 9
gold 9
support 8.8
building 8.7
clothes 8.4
east 8.4
pretty 8.4
studio 8.4
peace 8.2
body 8
virgin 7.9
cultural 7.8
cathedral 7.7
ethnic 7.6

Google

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 66.5%
Surprised 6.7%
Disgusted 3.2%
Confused 9%
Sad 17%
Angry 8.4%
Happy 13%
Calm 42.7%

AWS Rekognition

Age 26-43
Gender Female, 99.7%
Disgusted 4.8%
Calm 14.7%
Happy 15.1%
Angry 11.2%
Sad 10.1%
Confused 23.2%
Surprised 20.9%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 85.6%

Captions

Microsoft

a person wearing a costume 68.8%
a person wearing a costume 68.7%
a cat wearing a costume 30.5%