Human Generated Data

Title

Saint Peter Taking Money from the Mouth of the Fish

Date

17th century

People

Artist: Lucas Vorsterman I, Flemish 1595 - 1675

Artist after: Peter Paul Rubens, Flemish 1577 - 1640

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R4721

Machine Generated Data

Tags

Amazon
created on 2019-08-06

Art 99
Painting 99
Human 98.6
Person 98.6
Person 98.2
Person 92.8
Person 87
Person 76.6
Person 61.7

Clarifai
created on 2019-08-06

people 100
group 99.8
art 99.5
adult 99.4
many 98.5
print 97.7
man 96.9
woman 96.3
wear 96.3
leader 95.7
facial hair 95.3
soldier 94.6
child 94.6
administration 94.5
royalty 93.5
gown (clothing) 93.4
military 93.3
religion 92.1
painting 91
engraving 89.9

Imagga
created on 2019-08-06

sculpture 58.8
statue 58.7
art 38
religion 34.1
ancient 31.1
monument 29.9
architecture 29
history 28.6
old 27.9
religious 25.3
stone 23.3
carving 23.2
cemetery 22.6
decoration 22.5
culture 22.2
tourism 19.8
god 19.1
figure 19
antique 18.6
travel 18.3
saint 18.3
landmark 18.1
detail 17.7
city 17.5
historic 17.4
temple 17
historical 16
famous 15.8
catholic 15.6
holy 15.4
spirituality 15.4
church 14.8
building 13.5
symbol 12.8
angel 12.7
pray 12.6
spiritual 12.5
marble 12.2
vintage 11.6
museum 11.2
tourist 10.9
faith 10.5
graffito 10.4
design 10.1
roman 9.8
sketch 9.8
cathedral 9.6
sand 9.6
capital 9.5
column 9.4
exterior 9.2
carved 8.8
worship 8.7
golden 8.6
traditional 8.3
fountain 8
drawing 8
structure 7.9
classical 7.6
decorative 7.5
destination 7.5
bronze 7.5
east 7.5
representation 7.1

Google
created on 2019-08-06

Microsoft
created on 2019-08-06

outdoor 99.4
text 99.4
drawing 98.1
sketch 96.7
person 96.2
painting 91.7
clothing 91
group 81.8
old 81.6
man 81.1
white 69
posing 41.3

Color Analysis

Face analysis

AWS Rekognition

Age 49-69
Gender Female, 51.2%
Happy 45.5%
Sad 46.1%
Surprised 46.2%
Angry 47.9%
Calm 47.4%
Confused 46.7%
Disgusted 45.2%

AWS Rekognition

Age 38-59
Gender Female, 94.1%
Angry 2.2%
Disgusted 1.6%
Confused 3.9%
Sad 56.7%
Calm 21.3%
Surprised 5.4%
Happy 8.8%

AWS Rekognition

Age 23-38
Gender Male, 51.8%
Surprised 46.8%
Happy 45.6%
Angry 46.9%
Disgusted 45.7%
Calm 47.6%
Sad 46.2%
Confused 46.1%

AWS Rekognition

Age 35-53
Gender Male, 79.7%
Confused 1.8%
Sad 27.3%
Happy 0.7%
Surprised 3.2%
Angry 2.5%
Calm 63.4%
Disgusted 1.1%

AWS Rekognition

Age 48-68
Gender Male, 97.5%
Angry 4.6%
Sad 47.6%
Confused 1.9%
Calm 42.5%
Happy 1.6%
Surprised 0.9%
Disgusted 1%

AWS Rekognition

Age 48-68
Gender Male, 79.6%
Angry 1.2%
Sad 89.3%
Surprised 0.4%
Confused 1.5%
Calm 7.2%
Happy 0.2%
Disgusted 0.2%

Microsoft Cognitive Services

Age 28
Gender Female

Feature analysis

Amazon

Painting 99%
Person 98.6%