Human Generated Data

Title

Extreme Unction

Date

17th century

People

Artist: Jean Pesne, French, 1623–1700

Artist after: Nicolas Poussin, French, 1594–1665

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R1500

Machine Generated Data

Tags

Amazon
created on 2019-11-05

Art 99.5
Painting 99.5
Human 95.5
Person 95.5
Person 88.2
Person 87.3
Person 85.4
Person 80.6
Person 79.6
Person 79.5
Person 74
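The repeated `Person` entries above are not noise: label-detection services typically report one entry per detected instance, so the eight `Person` rows correspond to eight detected figures in the scene. A minimal sketch (my own aggregation, not the service's output format) of collapsing such a list into counts and best scores, using the values copied from the listing:

```python
from collections import defaultdict

# (label, confidence) pairs as listed above for this print.
labels = [
    ("Art", 99.5), ("Painting", 99.5), ("Human", 95.5),
    ("Person", 95.5), ("Person", 88.2), ("Person", 87.3),
    ("Person", 85.4), ("Person", 80.6), ("Person", 79.6),
    ("Person", 79.5), ("Person", 74.0),
]

def summarize(pairs):
    """Collapse repeated labels into (instance count, best confidence)."""
    best = defaultdict(lambda: (0, 0.0))
    for name, conf in pairs:
        count, top = best[name]
        best[name] = (count + 1, max(top, conf))
    return dict(best)

summary = summarize(labels)
# summary["Person"] -> (8, 95.5): eight detected figures,
# the best-scoring instance at 95.5
```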

Clarifai
created on 2019-11-05

people 100
print 99.7
group 99.6
art 99.3
adult 98.6
illustration 98
engraving 97.7
many 97.1
furniture 96.4
man 95.1
woman 94
book bindings 93.9
leader 92.3
room 90.5
administration 89.7
lithograph 89.6
seat 89
book series 88.5
book 87.7
sit 87.5

Imagga
created on 2019-11-05

sculpture 37.7
newspaper 34.3
product 27.8
art 27
old 25.8
statue 24.1
creation 23.4
monument 20.5
daily 20
ancient 19.9
architecture 19.7
history 19.7
structure 18.5
culture 17.1
vintage 16.5
fountain 16.5
money 16.2
antique 15.4
stone 15.3
memorial 15.1
carving 15
historic 14.7
landmark 14.4
tourism 14
marble 13.9
building 13.9
detail 13.7
travel 13.4
city 13.3
cash 12.8
decoration 12.6
currency 12.6
paper 12.5
historical 12.2
famous 12.1
dollar 12.1
bank 11.6
brass 11
finance 11
figure 10.9
temple 10.8
religion 10.8
face 10.7
god 10.5
sketch 10.5
one 10.4
banking 10.1
closeup 10.1
statues 9.8
retro 9.8
symbol 9.4
savings 9.3
drawing 9.3
church 9.3
decorative 9.2
aged 9
wealth 9
financial 8.9
holy 8.7
spiritual 8.6
bill 8.6
business 8.5
religious 8.4
column 8.2
design 7.9
representation 7.9
postmark 7.9
printed 7.9
stamp 7.7
dollars 7.7
facade 7.7
cathedral 7.7
pay 7.7
loan 7.7
mail 7.7
rich 7.4
close 7.4
gold 7.4
exterior 7.4
letter 7.3
museum 7.3
tourist 7.2

Google
created on 2019-11-05

Microsoft
created on 2019-11-05

building 99.2
text 98.6
person 97.2
clothing 95.8
painting 93.5
drawing 80.3
old 79.8
woman 62.6
clothes 17.7

Face analysis

Amazon

AWS Rekognition

Age 48-66
Gender Male, 50.5%
Sad 49.5%
Surprised 50.2%
Happy 49.5%
Fear 49.5%
Calm 49.6%
Disgusted 49.5%
Angry 49.6%
Confused 49.5%

AWS Rekognition

Age 24-38
Gender Male, 50.4%
Fear 49.5%
Surprised 49.5%
Disgusted 49.5%
Calm 49.6%
Angry 50.4%
Happy 49.5%
Sad 49.5%
Confused 49.5%

AWS Rekognition

Age 24-38
Gender Male, 50.5%
Disgusted 49.5%
Fear 49.5%
Angry 49.5%
Calm 50.4%
Happy 49.5%
Surprised 49.5%
Confused 49.5%
Sad 49.6%

AWS Rekognition

Age 38-56
Gender Male, 50.5%
Disgusted 49.5%
Happy 49.5%
Fear 49.5%
Confused 49.5%
Calm 50.2%
Surprised 49.5%
Sad 49.6%
Angry 49.7%

AWS Rekognition

Age 47-65
Gender Female, 50.1%
Happy 49.5%
Calm 49.6%
Fear 49.9%
Confused 49.6%
Surprised 49.6%
Angry 49.6%
Disgusted 49.5%
Sad 49.8%

AWS Rekognition

Age 27-43
Gender Male, 50.6%
Happy 45.4%
Disgusted 45%
Sad 45.3%
Calm 53.9%
Angry 45.1%
Fear 45.1%
Confused 45%
Surprised 45.1%

AWS Rekognition

Age 22-34
Gender Male, 53.4%
Happy 45%
Calm 45%
Angry 45%
Surprised 45%
Confused 45%
Disgusted 45%
Fear 45%
Sad 55%

AWS Rekognition

Age 22-34
Gender Male, 54.2%
Fear 45%
Disgusted 45.1%
Surprised 45%
Confused 45%
Happy 45.6%
Sad 45.2%
Calm 46.8%
Angry 52.2%
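Each face record above lists a confidence for every emotion class rather than a single verdict; the usual reading is to take the highest-scoring class as the dominant emotion. A small sketch of that step, using the scores of the last face in the listing:

```python
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Emotion percentages for the final AWS Rekognition face above.
face = {
    "Fear": 45.0, "Disgusted": 45.1, "Surprised": 45.0, "Confused": 45.0,
    "Happy": 45.6, "Sad": 45.2, "Calm": 46.8, "Angry": 52.2,
}

dominant_emotion(face)  # -> ("Angry", 52.2)
```

Note that the scores are near-uniform for most faces here (many around 45–50%), so the dominant class should be read as a weak signal rather than a confident classification.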

Feature analysis

Amazon

Painting 99.5%
Person 95.5%

Captions

Microsoft

a group of people in an old photo of a person 75.7%
a group of people standing in front of a building 75.6%
an old photo of a group of people in a room 75.5%

Text analysis

Amazon

ORENT
EVM
SVPER
EVM VNGENTLS
ORENT SVPER EVM VNGENTLS EVM O1.E0 IN NOMINE DOMINI
IN NOMINE
DOMINI
O1.E0
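The Amazon list above mixes word-level and line-level detections: OCR services such as Rekognition return both individual words and assembled lines, which is why the full inscription appears alongside its fragments. A simple sketch (my own heuristic, kept deliberately naive) that recovers the assembled line from such a mixed list, preserving the raw OCR spellings exactly as listed:

```python
# Detections copied verbatim from the Amazon text-analysis listing.
detections = [
    "ORENT", "EVM", "SVPER", "EVM VNGENTLS",
    "ORENT SVPER EVM VNGENTLS EVM O1.E0 IN NOMINE DOMINI",
    "IN NOMINE", "DOMINI", "O1.E0",
]

# Heuristic: the longest detection is the fully assembled line.
full_line = max(detections, key=len)
```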

Google

VNGENTES
0LEO
IN
NOMINE
ORENT SVPER EVM VNGENTES EVM 0LEO IN NOMINE DOMINI
SVPER
EVM
ORENT
DOMINI