Human Generated Data

Title

Virgin and Child

Date

c. 1850

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Philip M. Lydig, 1930.45

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Clothing 99.6
Helmet 99.6
Apparel 99.6
Art 96.8
Person 96.3
Human 96.3
Drawing 91.9
Painting 89.5
Face 83.7
Sketch 74.9
Photo 68.7
Photography 68.7
Portrait 68.7

Clarifai
created on 2020-04-24

people 99.7
art 99.6
illustration 98.8
print 98.5
portrait 97.6
one 97.5
adult 97.3
painting 97
veil 95.5
leader 95.3
man 93.9
facial hair 93.9
religion 93.6
wear 92.3
engraving 91.4
old 91.4
saint 91
aura 85
gown (clothing) 83.8
visuals 81.5

Imagga
created on 2020-04-24

statue 47.9
sculpture 42.4
temple 39.7
religion 35.9
brass 33.9
memorial 33.2
art 32.4
ancient 32
culture 29.1
structure 27.5
architecture 27.1
god 24.9
stone 24.2
cemetery 22.8
history 22.4
old 21.6
carving 21.3
religious 18.8
figure 17.1
travel 16.9
monument 16.8
holy 16.4
face 16.4
historical 16
painter 15.9
church 15.7
spirituality 15.4
decoration 15
famous 14.9
historic 14.7
building 14.5
antique 13
golden 12.9
museum 12.6
spiritual 12.5
tourism 12.4
city 11.7
column 11.1
close 10.9
pray 10.7
worship 10.6
meditation 10.6
portrait 10.4
east 10.3
symbol 10.1
head 10.1
decorative 10
gold 9.9
carved 9.8
catholic 9.7
one 9.7
money 9.4
vintage 9.1
detail 8.9
roman 8.8
sacred 8.8
marble 8.7
icon 8.7
prayer 8.7
faith 8.6
tourist 8.2
landmark 8.1
currency 8.1
ruler 8
belief 7.8
monk 7.7
oriental 7.6
traditional 7.5
dollar 7.4
closeup 7.4
man 7.4
banking 7.4
cash 7.3
relief 7.1
fountain 7.1
facade 7.1

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

drawing 99.4
sketch 99
painting 98.5
text 97.8
human face 94.3
book 90.7
person 85.4
cartoon 82.8
clothing 79.4
black and white 68
art 66

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Male, 80.6%
Calm 87.7%
Disgusted 1.8%
Sad 4.5%
Surprised 0.9%
Confused 2.2%
Angry 1.5%
Fear 0.6%
Happy 0.9%

AWS Rekognition

Age 16-28
Gender Female, 93.8%
Happy 2.4%
Fear 36.7%
Calm 15.9%
Disgusted 0.4%
Confused 3%
Surprised 26%
Angry 4.4%
Sad 11.2%

Feature analysis

Amazon

Helmet 99.6%
Person 96.3%

Captions

Microsoft
created on 2020-04-24

a black and white photo of a man 73.4%
an old photo of a man 73.3%
a man holding a book 37.2%

Text analysis

Amazon

AVAV
TOF
ON
TOF GI
KAdHruino
GI
HP.

Google

YAVAV NAVAVAVAVAYA NAVA ZVMAVAVAV KALHAUN MAVAVAVAVAVAN
NAVAVAVAVAYA
YAVAV
NAVA
ZVMAVAVAV
KALHAUN
MAVAVAVAVAVAN