Human Generated Data

Title

Madonna and Child on the Clouds

Date

16th-17th century

People

Artist: Vespasiano Strada, Italian c. 1582 - 1622

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Francis H. Burr Memorial Fund, S6.40.3

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Art 96.6
Painting 88.7
Human 80.4
Person 80.4
Person 52.2

Clarifai
created on 2018-04-19

people 100
art 99.8
print 99.8
illustration 99.7
engraving 99
adult 97.5
painting 96.4
man 96
portrait 95.7
Renaissance 95.4
royalty 93.9
leader 92.2
antique 92.1
one 90.6
etching 90.4
veil 87.8
group 87.4
woman 87.3
writer 86.3
baby 84.1

Imagga
created on 2018-04-19

sketch 59.1
drawing 45.8
representation 38
art 31.4
currency 27
money 26.4
vintage 24
cash 22
mosaic 21.7
finance 20.3
old 20.2
paper 19.7
ancient 19.1
bank 18.8
financial 18.7
banking 18.4
dollar 17.7
antique 17.1
grunge 17.1
stamp 16.7
retro 16.4
religion 16.2
close 15.4
wealth 15.3
design 15.3
pattern 15.1
tile 14.9
business 14.6
history 14.3
rich 14
mail 13.4
savings 13.1
economy 13
decoration 12.9
culture 12.8
postmark 12.8
texture 12.5
exchange 12.4
church 12
postage 11.8
banknotes 11.8
dollars 11.6
pay 11.5
temple 11.4
note 11
letter 11
investment 11
postal 10.8
arabesque 10.8
bills 10.7
banknote 10.7
bill 10.5
traditional 10
us 9.7
loan 9.6
closeup 9.4
envelope 9.4
religious 9.4
travel 9.2
gold 9.1
one 9
funds 8.8
carved 8.8
artistic 8.7
architecture 8.6
post 8.6
capital 8.6
face 8.5
tattoo 8.4
decorative 8.4
painting 8.1
collection 8.1
carving 8.1
symbol 8.1
detail 8.1
market 8
philately 7.9
century 7.8
black 7.8
golden 7.7
payment 7.7
spirituality 7.7
profit 7.7
god 7.7
transducer 7.6
global 7.3
aged 7.3

Google
created on 2018-04-19

Microsoft
created on 2018-04-19

text 100
book 99.9

Face analysis

AWS Rekognition

Age 20-38
Gender Male, 54.1%
Sad 45.7%
Calm 53.9%
Angry 45.1%
Confused 45.1%
Disgusted 45.1%
Happy 45%
Surprised 45.1%

AWS Rekognition

Age 45-63
Gender Female, 51.7%
Disgusted 45.3%
Happy 52.5%
Calm 45.3%
Sad 46.4%
Surprised 45.1%
Angry 45.2%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Happy 45.2%
Angry 45.2%
Disgusted 45%
Calm 52.7%
Surprised 45.1%
Sad 46.8%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 51.7%
Disgusted 1.3%
Angry 5.3%
Happy 0.8%
Calm 16.5%
Surprised 2.3%
Sad 70.8%
Confused 3.2%

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Angry 45.3%
Disgusted 45.5%
Happy 45.2%
Surprised 45.4%
Calm 52%
Confused 45.4%
Sad 46.3%

AWS Rekognition

Age 30-47
Gender Female, 64.9%
Angry 4.5%
Happy 1.3%
Calm 15.4%
Surprised 2.8%
Sad 71.9%
Disgusted 0.9%
Confused 3.2%

AWS Rekognition

Age 15-25
Gender Male, 54.1%
Sad 48.5%
Calm 47.7%
Confused 45.9%
Angry 45.7%
Happy 45.7%
Disgusted 45.8%
Surprised 45.8%

Microsoft Cognitive Services

Age 55
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Painting 88.7%
Person 80.4%

Captions

Microsoft
created on 2018-04-19

a close up of a book 62.2%
close up of a book 57.1%
a hand holding a book 57%

Text analysis

Amazon

VESPASIANVS.S-LF: