Human Generated Data

Title

The Virgin and Child

Date

c. 1435

People

Artist: Blaž Jurjev, c. 1375 - c. 1450

Artist: Jacobello del Fiore, Italian 1394 - 1439

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Meta and Paul J. Sachs, 1965.459

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2020-04-24

Art 97.2
Painting 97.2
Human 95.4
Person 95.4
Archaeology 69.5
Drawing 63.4
Wall 59.5
Person 47.7
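
The Amazon tags above are label-detection confidences. A minimal sketch of how labels like these might be retrieved with the AWS Rekognition DetectLabels API via boto3; the bucket, object key, and confidence threshold are illustrative assumptions, not values taken from this record:

# Hypothetical sketch: image labels via AWS Rekognition (boto3).
# Bucket/key names and the threshold are placeholders for illustration.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-art-images", "Name": "1965.459.jpg"}},
    MaxLabels=20,
    MinConfidence=40,
)

# Print each label with its confidence, similar to the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')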

Clarifai
created on 2020-04-24

art 98.6
people 98.3
religion 97.3
old 94.9
wear 94.9
portrait 94.8
painting 93
adult 92.7
print 91.3
antique 90.5
leader 90.3
ancient 90.3
veil 89.9
monarch 89
woman 87.6
man 87.4
saint 85.6
vintage 84.9
engraving 84.8
one 83.5

Imagga
created on 2020-04-24

gravestone 100
cemetery 90.6
memorial 84.5
stone 75.1
structure 47.2
old 39.7
texture 32.7
grunge 26.4
art 26
ancient 25.1
architecture 23.4
pattern 23.3
wall 22.2
rough 21.9
antique 21.6
close 20.6
history 19.7
textured 19.3
aged 19
detail 17.7
vintage 17.4
surface 16.8
design 16.3
dirty 16.3
material 16.1
sculpture 15.3
religion 14.4
brown 14
carving 13.7
weathered 13.3
building 12.7
dark 12.5
wood 12.5
god 12.4
decoration 12
culture 12
worn 11.5
closeup 11.5
grungy 11.4
backdrop 10.7
travel 10.6
wooden 10.6
artistic 10.4
church 10.2
frame 10
face 10
backgrounds 9.7
spirituality 9.6
concrete 9.6
ornament 9.5
color 9.5
wallpaper 9.2
decorative 9.2
cement 8.7
museum 8.7
paper 8.6
damaged 8.6
old fashioned 8.6
temple 8.6
tree 8.5
exterior 8.3
outdoors 8.2
retro 8.2
border 8.1
decor 8
carved 7.8
gate 7.8
rusty 7.6
religious 7.5
east 7.5
monument 7.5
shape 7.5
light 7.4
effect 7.3
paint 7.2
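
The Imagga tags come from its image-tagging service. A minimal sketch, assuming the public /v2/tags REST endpoint; the API credentials and image URL are placeholders, not values from this record:

# Hypothetical sketch: tags with confidences from the Imagga /v2/tags endpoint.
import requests

API_KEY = "your_api_key"          # placeholder credential
API_SECRET = "your_api_secret"    # placeholder credential
IMAGE_URL = "https://example.org/1965.459.jpg"  # illustrative image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each tag carries a confidence score, matching the list above.
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))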

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

drawing 99.4
sketch 98.7
text 98.1
painting 95.8
human face 89.1
art 81.4
person 80
old 75.9
child art 59.3
illustration 58.3

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 72.7%
Confused 2.2%
Fear 3.9%
Disgusted 0.5%
Sad 52.8%
Calm 33.7%
Happy 1.3%
Surprised 3.2%
Angry 2.4%

AWS Rekognition

Age 39-57
Gender Female, 51.8%
Fear 45.6%
Calm 47.8%
Disgusted 45.1%
Happy 46.3%
Angry 46.3%
Sad 48.5%
Surprised 45.3%
Confused 45.1%
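
The age ranges, gender estimates, and emotion scores above are the kind of attributes returned by the Rekognition DetectFaces API. A minimal sketch, assuming boto3 and an illustrative S3 location:

# Hypothetical sketch: face attributes (age range, gender, emotions) via AWS Rekognition.
# The S3 bucket and key are placeholders for illustration.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-art-images", "Name": "1965.459.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')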

Microsoft Cognitive Services

Age 29
Gender Female

Feature analysis

Amazon

Painting 97.2%
Person 95.4%

Captions

Microsoft
created on 2020-04-24

an old photo of a person 71.3%
an old photo of a person 67.6%
old photo of a person 66.6%
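
Captions like these come from an image-description call that returns several ranked candidates. A minimal sketch, assuming the Azure Computer Vision Describe Image REST endpoint; the endpoint, subscription key, and image URL are placeholders, not values from this record:

# Hypothetical sketch: candidate captions from Azure Computer Vision "describe".
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"  # placeholder endpoint
KEY = "your_subscription_key"                             # placeholder credential
IMAGE_URL = "https://example.org/1965.459.jpg"            # illustrative image URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.1/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Each candidate caption carries a confidence, as in the list above.
for caption in resp.json()["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))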