Human Generated Data

Title

Adoration of the Infant Christ by Mary and Joseph

Date

1827

People

Artist: Caterina Piotti-Pirola, Italian ca. 1800 - after 1830

Artist after: Bernardino Luini, Italian ca. 1480 - ca. 1532

Classification

Prints

Machine Generated Data

Tags

Amazon

Human 98.5
Person 98.5
Person 97.3
Painting 96.2
Art 96.2
Person 94.6
Person 80.7
Person 69.2
Photo 64.5
Photography 64.5
Portrait 64.5
Face 64.5
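Each machine-generated tag above pairs a label with a confidence score. As a minimal sketch (with the scores transcribed from the Amazon list above, and a hypothetical threshold of 90), filtering such a list down to its high-confidence, distinct labels might look like:

```python
# Tags as (label, confidence) pairs, transcribed from the Amazon list above.
amazon_tags = [
    ("Human", 98.5), ("Person", 98.5), ("Person", 97.3), ("Painting", 96.2),
    ("Art", 96.2), ("Person", 94.6), ("Person", 80.7), ("Person", 69.2),
    ("Photo", 64.5), ("Photography", 64.5), ("Portrait", 64.5), ("Face", 64.5),
]

def confident_labels(tags, threshold=90.0):
    """Return the distinct labels scoring at or above the threshold,
    preserving the original (descending-confidence) order."""
    seen, out = set(), []
    for label, conf in tags:
        if conf >= threshold and label not in seen:
            seen.add(label)
            out.append(label)
    return out

print(confident_labels(amazon_tags))  # -> ['Human', 'Person', 'Painting', 'Art']
```

Repeated labels (here, one "Person" per detected figure) collapse to a single entry at the highest confidence seen first.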

Clarifai

people 99.9
art 99.3
group 98.9
adult 98.6
illustration 98.5
print 98
two 97.1
portrait 97
man 96.9
child 96.8
woman 96.4
painting 94.6
one 94.4
baby 92
engraving 90.9
son 90.7
religion 90.5
three 90.1
wear 89.8
affection 87.8

Imagga

sculpture 66.6
sketch 54
statue 52.5
drawing 39
carving 37.9
art 37.8
religion 35
representation 34.7
architecture 30.5
ancient 30.3
stone 30.1
cemetery 29
old 27.2
temple 25.1
god 24.9
culture 24.8
column 24.2
history 23.3
religious 21.6
figure 20.7
detail 20.1
antique 19.9
monument 19.6
marble 19.5
church 18.5
travel 17.6
building 17.5
famous 16.8
historic 16.5
catholic 15.6
holy 15.4
decoration 15.1
spirituality 14.4
face 14.2
tourism 13.2
landmark 12.7
city 12.5
fountain 12.2
symbol 12.1
carved 11.7
pray 11.6
worship 11.6
heritage 11.6
plastic art 11.6
historical 11.3
structure 11.1
head 10.9
facade 10.6
saint 10.6
spiritual 10.6
faith 10.5
east 10.3
exterior 10.2
roman 10.1
vintage 9.9
architectural 9.6
decorative 9.2
lion 8.8
museum 8.7
prayer 8.7
golden 8.6
oriental 8.5
details 8.5
outdoor 8.4
classic 8.4
traditional 8.3
style 8.2
tourist 8.2
sand 8.1
design 8
sacred 7.8
meditation 7.7
cross 7.5
close 7.4
man 7.4
soil 7.3
portrait 7.1

Microsoft

drawing 97.9
painting 97.7
text 96.9
sketch 94.4
person 87
cartoon 86.5
human face 73.8
art 63.3
posing 61.8
baby 54.6
clothing 52
old 43.7
picture frame 19.1

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 54.5%
Disgusted 45.2%
Angry 45.2%
Fear 45.1%
Calm 51.9%
Happy 46.9%
Sad 45.5%
Surprised 45.2%
Confused 45.1%

AWS Rekognition

Age 13-25
Gender Female, 98.4%
Confused 0%
Fear 0%
Calm 99%
Angry 0%
Surprised 0%
Sad 0.8%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 36-52
Gender Female, 51.8%
Happy 0%
Confused 0.2%
Disgusted 0%
Sad 12.8%
Angry 0.1%
Fear 0%
Calm 86.9%
Surprised 0%

AWS Rekognition

Age 17-29
Gender Female, 52.5%
Calm 54.9%
Disgusted 45%
Angry 45%
Surprised 45%
Happy 45%
Confused 45%
Fear 45%
Sad 45%

AWS Rekognition

Age 0-3
Gender Female, 50.5%
Angry 45.2%
Surprised 45.1%
Calm 53.9%
Happy 45%
Fear 45%
Disgusted 45%
Sad 45.5%
Confused 45.2%
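Each AWS Rekognition face record above reports one confidence score per emotion, and the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, using the second face's scores transcribed from above:

```python
# Emotion scores for the second detected face, transcribed from above.
face_emotions = {
    "Confused": 0.0, "Fear": 0.0, "Calm": 99.0, "Angry": 0.0,
    "Surprised": 0.0, "Sad": 0.8, "Happy": 0.1, "Disgusted": 0.0,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # -> ('Calm', 99.0)
```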

Microsoft Cognitive Services

Age 43
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Painting 96.2%

Captions

Microsoft

a group of people posing for a photo 70.8%
a group of people posing for the camera 70.7%
a group of women posing for a photo 68.7%

Text analysis

Amazon

ADORAVIT
QUEM GENUIT ADORAVIT
GENUIT
QUEM

Google

ADORAVIT QUEM GENUIT
GENUIT
ADORAVIT
QUEM
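The two OCR services above detect the same inscription words and differ only in how they group them into lines. A minimal sketch (tokens transcribed from the lists above) confirming the word-level agreement:

```python
# OCR results transcribed from the Amazon and Google text-analysis lists above.
amazon_text = ["ADORAVIT", "QUEM GENUIT ADORAVIT", "GENUIT", "QUEM"]
google_text = ["ADORAVIT QUEM GENUIT", "GENUIT", "ADORAVIT", "QUEM"]

def word_set(lines):
    """Split each detected line into words and collect the distinct set."""
    return {word for line in lines for word in line.split()}

# Both services agree on the individual words of the inscription.
print(sorted(word_set(amazon_text) & word_set(google_text)))
# -> ['ADORAVIT', 'GENUIT', 'QUEM']
```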