Human Generated Data

Title

Virgin and Child (modern pastiche)

Date

late 19th-early 20th century

People

Artist: Unidentified Artist

Previous attribution: Sano di Pietro (Ansano di Pietro di Mencio), Italian, 1405-1481

Classification

Paintings

Machine Generated Data

Tags

Amazon

Human 97.9
Person 97.9
Art 95.6
Drawing 79.1
Face 77.5
Photo 67.5
Portrait 67.5
Photography 67.5
Painting 65.9
Text 58.8

Clarifai

people 98.7
art 98.5
old 93.9
painting 93.7
one 93.3
religion 91.4
adult 90.9
portrait 90
print 88.2
illustration 88.1
wear 87
antique 86.9
retro 86.6
woman 85.3
vintage 82.5
man 82.1
ancient 79.4
veil 78
aura 77
monarch 74.8

Imagga

knocker 38.6
device 34.3
religion 31.4
art 30.6
old 29.3
architecture 26.8
ancient 25.9
culture 23.1
temple 22.7
antique 21.6
history 21.5
stone 19.5
texture 19.4
vintage 18.2
wall 18.1
detail 17.7
door 17.1
hole 16.9
travel 16.9
church 16.6
god 16.3
sculpture 16.2
building 15.9
cell 14.7
structure 14.1
carving 13.7
spirituality 13.4
design 12.9
icon 12.7
decoration 12.6
statue 12.4
brass 12.3
memorial 12.2
religious 12.2
symbol 12.1
famous 12.1
grunge 11.9
museum 11.6
pattern 11.6
monument 11.2
historic 11
paper 11
masterpiece 10.9
close 10.8
painter 10.8
century 10.8
tourism 10.7
face 10.7
safe 10.6
artist 10.6
holy 10.6
mosaic 10.5
one 10.5
golden 10.3
carved 9.8
book 9.3
dark 9.2
letter 9.2
traditional 9.1
gold 9
currency 9
entrance 8.7
ornament 8.6
painted 8.6
house 8.4
east 8.4
frame 8.3
metal 8
window 8
textured 7.9
southeast 7.9
bible 7.8
orthodox 7.8
gate 7.8
heritage 7.7
tile 7.7
construction 7.7
middle 7.7
cathedral 7.7
money 7.7
weathered 7.6
historical 7.5
decorative 7.5
wood 7.5
style 7.4
closeup 7.4
stucco 7.2
aged 7.2
black 7.2
strongbox 7.2
material 7.1
financial 7.1
wooden 7

Microsoft

text 99.1
drawing 97.8
human face 92.4
painting 91.3
old 90.5
sketch 90.2
person 89.8
clothing 73.9
dirty 11.2
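The tag lists above pair each label with a confidence score. As a minimal sketch of how such a list is derived from a labeling service's response, the helper below flattens an AWS Rekognition DetectLabels-style payload into sorted (name, confidence) pairs; the response shape follows the Rekognition API, while the sample values are copied from the Amazon tags in this record and the `labels_to_pairs` helper name is ours.

```python
# Sketch: flatten a DetectLabels-style response into the
# "Label  Confidence" pairs listed above. The dict shape follows the
# AWS Rekognition API; sample values come from this record's Amazon tags.

def labels_to_pairs(response, threshold=55.0):
    """Return (name, confidence) pairs at or above the threshold,
    sorted by descending confidence."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= threshold
    ]
    return sorted(pairs, key=lambda p: -p[1])

sample = {
    "Labels": [
        {"Name": "Human", "Confidence": 97.9},
        {"Name": "Art", "Confidence": 95.6},
        {"Name": "Text", "Confidence": 58.8},
    ]
}

print(labels_to_pairs(sample))
# [('Human', 97.9), ('Art', 95.6), ('Text', 58.8)]
```

The threshold simply mirrors the cutoffs visible above, where low-confidence labels (such as Microsoft's "dirty 11.2") sometimes survive and sometimes do not, depending on the service.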

Face analysis

AWS Rekognition

Age 25-39
Gender Female, 50.6%
Sad 27%
Surprised 0.2%
Fear 0.1%
Happy 0.4%
Angry 0.4%
Calm 71.4%
Disgusted 0.1%
Confused 0.4%

AWS Rekognition

Age 36-54
Gender Female, 91.2%
Calm 4.8%
Fear 22.4%
Angry 3.5%
Sad 3.7%
Surprised 60.1%
Disgusted 0.4%
Happy 2.6%
Confused 2.6%

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
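Each face entry above reports an age range, a gender estimate, and per-emotion confidences. As a minimal sketch, the helper below reads an AWS Rekognition DetectFaces-style face record and picks the dominant emotion; the dict shape follows the Rekognition API, the values are copied from the first face in this record, and `dominant_emotion` is an illustrative helper name.

```python
# Sketch: report the strongest emotion from a DetectFaces-style face
# record. The dict shape follows the AWS Rekognition API; the sample
# values come from the first AWS face entry in this record.

def dominant_emotion(face_detail):
    """Return (emotion_type, confidence) for the strongest emotion."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

face = {
    "AgeRange": {"Low": 25, "High": 39},
    "Gender": {"Value": "Female", "Confidence": 50.6},
    "Emotions": [
        {"Type": "CALM", "Confidence": 71.4},
        {"Type": "SAD", "Confidence": 27.0},
        {"Type": "HAPPY", "Confidence": 0.4},
    ],
}

print(dominant_emotion(face))
# ('CALM', 71.4)
```

Note how the two AWS faces above disagree (Calm 71.4% vs. Surprised 60.1%): the dominant emotion is just the arg-max over per-emotion confidences, not a calibrated judgment.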

Feature analysis

Amazon

Person 97.9%
Painting 65.9%

Captions

Microsoft

a vintage photo of an old building 50.6%

Text analysis

Amazon

wmem