Human Generated Data

Title

Sculptor's Model for the Virgin and Child

Date

15th-17th century

People

Artist: Unidentified Artist

Previous attribution: Andrea del Verrocchio, Italian, 1435-1488

Classification

Sculpture

Credit Line

Harvard Art Museums/Fogg Museum, Alpheus Hyatt Purchasing Fund, 1929.225

Machine Generated Data

Tags

Imagga
created on 2018-12-17

statue 100
sculpture 39.7
monk 32.8
religion 29.6
art 24.3
dress 22.6
ancient 21.6
old 20.9
monument 20.6
history 19.7
religious 19.7
antique 17.6
catholic 17.5
god 17.2
culture 17.1
face 16.3
people 16.2
saint 15.4
faith 15.3
portrait 14.9
holy 14.5
model 14
church 13.9
stone 13.5
spiritual 13.4
attractive 13.3
person 13
lady 13
gown 12.8
marble 12.8
clothing 12.7
architecture 12.5
figure 12.3
fashion 12.1
travel 12
hair 11.9
love 11.8
pray 11.6
tourism 11.6
pretty 11.2
adult 11.1
historic 11
decoration 10.9
happy 10.7
spirituality 10.6
bride 10.6
style 10.4
historical 10.4
roman 10.1
symbol 10.1
man 10.1
city 10
mother 9.8
sacred 9.7
garment 9.7
detail 9.7
skirt 9.5
smile 9.3
wedding 9.2
worship 8.7
elegance 8.4
carving 8.1
building 7.9
couple 7.8
happiness 7.8
angel 7.8
prayer 7.7
sky 7.7
classical 7.6
studio 7.6
hand 7.6
human 7.5
traditional 7.5
famous 7.4
sensual 7.3
sensuality 7.3
temple 7.2
landmark 7.2
blond 7.2
cute 7.2
bronze 7.1

Google
created on 2018-12-17

Microsoft
created on 2018-12-17

person 98.3
curtain 95.2
indoor 93
sculpture 84.1
striped 50.4
museum 50.4
statue 31.4
black and white 13
cemetery 7.9

Face analysis

Microsoft Cognitive Services

Age 16
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Captions

Microsoft

a person holding a baby 38.3%
a person sitting in front of a curtain 38.2%
a person holding a teddy bear 34.3%