Human Generated Data

Title

Virgin and Child with a Male Donor

Date

c. 1370

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1923.46

Machine Generated Data

Tags
(all scores are model confidence values, in percent)

Amazon
created on 2020-04-24

Human 98
Person 98
Architecture 97.3
Building 97.3
Person 97.3
Art 93.1
Person 91.2
Person 89.2
Painting 81.5
Person 81.1
Drawing 71.1
Housing 62.4
Mansion 58.8
House 58.8
Church 56.7
Apse 55.6
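
The labels above are the kind of output returned by Amazon Rekognition's detect_labels operation. A minimal sketch, assuming boto3 with valid AWS credentials; the file name and thresholds are illustrative, not taken from this record:

    import boto3

    # Minimal sketch: label an image with Amazon Rekognition.
    # Assumes configured AWS credentials; the file name is illustrative.
    client = boto3.client("rekognition")

    with open("virgin_and_child.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=55,  # illustrative floor; the lowest score above is 55.6
    )

    # Each label carries a name and a confidence percentage, as listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')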

Clarifai
created on 2020-04-24

people 99.8
art 98.6
religion 98.1
adult 97.9
veil 96.4
painting 96.3
group 96.3
print 95.5
man 95.3
illustration 95.3
saint 93.7
woman 91.3
one 90.6
leader 90
gown (clothing) 89.5
church 89.4
wear 89.2
kneeling 88.7
god 87.5
cross 85.7
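
Clarifai concepts like these typically come from its general image-recognition model. The sketch below reflects the classic Clarifai v2 REST API; the model ID, key placeholder, image URL, and response shape are assumptions and may differ in current API versions:

    import requests

    # Hedged sketch of a Clarifai v2 prediction with the general model.
    # Model ID, API key, image URL, and response shape are assumptions
    # based on the classic v2 REST API and may differ in newer versions.
    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key <your-api-key>"},
        json={"inputs": [{"data": {"image": {
            "url": "https://example.com/virgin_and_child.jpg"}}}]},
    )
    response.raise_for_status()

    # Concepts come back with a name and a value in [0, 1].
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')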

Imagga
created on 2020-04-24

facade 100
architecture 56.3
sculpture 49.1
religion 40.4
building 36.4
church 36.1
statue 34.4
cathedral 33
history 31.4
ancient 31.2
art 31
old 30
monument 29.9
landmark 29.9
stone 28.3
historic 27.6
culture 27.4
tourism 27.3
structure 26.2
altar 24.4
famous 24.2
historical 22.6
city 21.7
arch 20.8
religious 20.6
marble 20.5
travel 20.5
god 20.1
column 19.5
temple 18.6
door 17.2
carving 16.6
exterior 15.7
antique 15.6
spirituality 15.4
catholic 14.6
holy 13.5
decoration 12.8
tourist 12.7
architectural 12.5
window 12.4
memorial 12.1
palace 12
catholicism 11.8
museum 11.8
baroque 11.7
entrance 11.6
prayer 11.6
spiritual 11.5
medieval 11.5
detail 11.3
roman 10.7
heritage 10.6
saint 10.6
faith 10.5
statues 9.9
century 9.8
worship 9.7
style 9.7
attraction 9.6
wall 9.4
traditional 9.2
gold 9.1
figure 8.9
carved 8.8
pray 8.7
ornament 8.6
fountain 8.5
vacation 8.2
symbol 8.1
basilica 7.9
praying 7.8
sacred 7.8
st 7.8
capital 7.6
buildings 7.6
details 7.6
cross 7.6
destination 7.5
stucco 7.5
vintage 7.5
ornate 7.3
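
Imagga tags of this form are returned by its v2 tagging endpoint. A hedged sketch using requests; the credentials and image URL are placeholders:

    import requests

    # Hedged sketch of Imagga's v2 tagging endpoint. The API key, secret,
    # and image URL are placeholders.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/virgin_and_child.jpg"},
        auth=("<api_key>", "<api_secret>"),  # HTTP Basic auth
    )
    response.raise_for_status()

    # Tags arrive with an English name and a confidence percentage.
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')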

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 97.9
drawing 87.1
sketch 86.3
old 80
person 75.6
church 65.4
painting 20.1
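
Microsoft's tags match the shape of the Azure Computer Vision analyze-image call. A minimal sketch; the endpoint, key, image URL, and API version (v3.2 here) are assumptions:

    import requests

    # Minimal sketch of the Azure Computer Vision analyze-image call.
    # Endpoint, key, image URL, and API version are placeholders/assumptions.
    endpoint = "https://<your-resource>.cognitiveservices.azure.com"
    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "<key>"},
        json={"url": "https://example.com/virgin_and_child.jpg"},
    )
    response.raise_for_status()

    # Tags come back with a name and a confidence in [0, 1].
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')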

Face analysis

AWS Rekognition

Age 48-66
Gender Female, 51.2%
Calm 45.9%
Fear 45%
Angry 45%
Sad 54%
Surprised 45%
Disgusted 45%
Happy 45%
Confused 45%

AWS Rekognition

Age 17-29
Gender Female, 54.3%
Fear 45.1%
Angry 45.2%
Confused 45%
Disgusted 45%
Happy 45.4%
Calm 53.8%
Sad 45.4%
Surprised 45.1%

AWS Rekognition

Age 22-34
Gender Female, 92.8%
Disgusted 2%
Angry 8%
Happy 0.8%
Sad 32.7%
Confused 5.2%
Calm 31.7%
Fear 11.7%
Surprised 7.9%

AWS Rekognition

Age 45-63
Gender Male, 54.8%
Happy 45.1%
Disgusted 45%
Angry 45.1%
Calm 54%
Sad 45.7%
Fear 45%
Confused 45%
Surprised 45%
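
The four records above have the structure of Amazon Rekognition's detect_faces output (age range, gender, and per-emotion confidences). A minimal sketch under the same assumptions as the label example:

    import boto3

    # Minimal sketch of Rekognition face analysis. Assumes configured AWS
    # credentials; the file name is illustrative.
    client = boto3.client("rekognition")

    with open("virgin_and_child.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]    # e.g. {"Low": 48, "High": 66}
        gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 51.2}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')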

Microsoft Cognitive Services

Age 45
Gender Female

Feature analysis

Amazon

Person 98%
Painting 81.5%

Text analysis

Amazon

Detected text: "10"
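
A detected string like this is what Rekognition's detect_text operation returns for an image. A minimal sketch, again with an illustrative file name:

    import boto3

    # Minimal sketch of Rekognition text detection (OCR). Assumes configured
    # AWS credentials; the file name is illustrative.
    client = boto3.client("rekognition")

    with open("virgin_and_child.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # LINE detections give whole strings, such as the "10" recorded above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])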