Human Generated Data

Title

Plate 44: Two Groups of the Virgin and Child

Date

17th century

People

Artist: Jan de Bisschop, Dutch 1628 - 1671

Artist after: Giulio Romano, Italian 1499? - 1546

Artist after: Parmigianino, Italian 1503 - 1540

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G1279

Machine Generated Data

Tags (label, confidence score out of 100)

Amazon
created on 2019-10-30

Art 96.1
Human 96
Drawing 93.2
Person 91.5
Painting 91.4
Person 90.5
Person 89.4
Person 88.4
Sketch 85.6
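
The Amazon tags above are label/confidence pairs from an image-labeling service. This record does not document the exact call used; the snippet below is a minimal sketch of how comparable pairs can be retrieved with AWS Rekognition's detect_labels operation (the file path, region, and thresholds are placeholder assumptions, not values from this record).

```python
# Sketch: retrieving label/confidence pairs comparable to the Amazon tags above.
# Assumes AWS credentials are configured; "plate44.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("plate44.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of labels returned
    MinConfidence=80.0,  # drop labels scored below 80
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```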

Clarifai
created on 2019-10-30

people 99.8
art 99.7
adult 99.4
group 98.8
man 98.5
two 98.5
painting 97.8
illustration 97.4
wear 96.3
print 96.2
veil 95.4
one 95
woman 94.5
nude 93.8
Renaissance 93.6
interaction 92.8
three 92.2
facial hair 90.8
kneeling 89.4
allegory 89.2

Imagga
created on 2019-10-30

graffito 100
decoration 88
sketch 45.6
drawing 43.2
vintage 31.5
art 30
representation 27.4
grunge 26.4
antique 26.1
old 25.1
paper 23.5
ancient 23.4
retro 23
texture 20.2
pattern 19.8
design 17.3
currency 15.3
frame 15
money 14.5
finance 14.4
cash 13.7
painting 13.6
flower 13.1
floral 12.8
textured 12.3
decorative 11.7
history 11.6
financial 11.6
close 11.4
artistic 11.3
stamp 11.2
bank 10.8
banking 10.1
aged 10
sepia 9.7
detail 9.7
style 9.6
old fashioned 9.5
wall 9.4
culture 9.4
nobody 9.3
dollar 9.3
letter 9.2
travel 9.2
border 9.1
religion 9
parchment 8.6
mail 8.6
temple 8.6
business 8.5
wallpaper 8.4
artwork 8.2
dirty 8.1
brown 8.1
symbol 8.1
wealth 8.1
material 8
postmark 7.9
holiday 7.9
postage 7.9
postal 7.9
architecture 7.8
color 7.8
leaf 7.8
banknote 7.8
dollars 7.7
worn 7.6
rich 7.4
page 7.4
backdrop 7.4
economy 7.4
closeup 7.4
note 7.4
ornate 7.3
backgrounds 7.3
graphic 7.3
global 7.3
paint 7.2
black 7.2
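
The Imagga list above follows the same label/confidence pattern. As a hedged sketch only: Imagga exposes a REST tagging endpoint (v2 /tags) authenticated with an API key/secret pair; the key, secret, and image URL below are placeholders, not values from this record.

```python
# Sketch: requesting tag/confidence pairs comparable to the Imagga list above.
# The API key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/plate44.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```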

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

text 99.8
sketch 99.6
drawing 99.3
book 98
cartoon 82
human face 81.2
person 63.8
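
The Microsoft tags above can be reproduced, in outline, with the Azure Computer Vision "tag" operation. The sketch below assumes the azure-cognitiveservices-vision-computervision SDK; the endpoint, subscription key, and image URL are placeholders.

```python
# Sketch: image tagging comparable to the Microsoft tags above.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)

result = client.tag_image("https://example.org/plate44.jpg")
for tag in result.tags:
    # The SDK reports confidence on a 0-1 scale; scale it to match the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```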

Color Analysis

Face analysis

AWS Rekognition

Age 47-65
Gender Male, 62.6%
Sad 0.3%
Fear 0%
Calm 98.6%
Disgusted 0%
Happy 0.1%
Angry 0.1%
Surprised 0.7%
Confused 0.1%

AWS Rekognition

Age 22-34
Gender Male, 78.8%
Calm 79.7%
Angry 13.1%
Fear 0.8%
Confused 0.6%
Sad 2.8%
Disgusted 0.8%
Happy 0.7%
Surprised 1.6%

AWS Rekognition

Age 22-34
Gender Male, 53.4%
Happy 45%
Sad 45.1%
Angry 45%
Disgusted 45.1%
Surprised 47.8%
Calm 51.9%
Fear 45.1%
Confused 45%

AWS Rekognition

Age 21-33
Gender Male, 61.1%
Disgusted 0.4%
Angry 71.7%
Confused 0.2%
Sad 1.2%
Happy 0.2%
Fear 0%
Surprised 0.1%
Calm 26.1%
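
Each AWS Rekognition entry above (age range, gender, emotion scores) corresponds to one detected face. A minimal sketch, assuming the boto3 detect_faces call with all attributes requested and a placeholder file path:

```python
# Sketch: per-face age range, gender, and emotion estimates comparable to the
# AWS Rekognition entries above. "plate44.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("plate44.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```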

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
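
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch, assuming the google-cloud-vision client and a placeholder file path:

```python
# Sketch: likelihood-style face attributes comparable to the Google Vision
# entries above. "plate44.jpg" is a placeholder path; application default
# credentials are assumed to be configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("plate44.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```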

Feature analysis

Amazon

Person 91.5%
Painting 91.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-10-30

a close up of a book 39%
close up of a book 34%
a hand holding a book 33.9%
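
The candidate captions above come from an automatic image-description step. A minimal sketch, assuming the Azure Computer Vision "describe" operation with placeholder endpoint, key, and image URL:

```python
# Sketch: generating ranked candidate captions comparable to the Microsoft
# captions above. Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder key
)

description = client.describe_image(
    "https://example.org/plate44.jpg",
    max_candidates=3,  # ask for several ranked captions, as in the list above
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```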

Text analysis

Amazon

Iho
Iho Rem
Rem
Sare
Sare anmeeiane
anmeeiane

Google

Iaho Rem in Parmeopiane ins.
Iaho
Rem
in
Parmeopiane
ins.
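
The text analysis entries are OCR readings of the plate's engraved inscription, reported both as a full string and as individual words. A minimal sketch of the Google Cloud Vision side, with a placeholder file path (AWS Rekognition's detect_text operation returns comparable word-level detections):

```python
# Sketch: OCR of the engraved inscription, comparable to the text analysis
# above. "plate44.jpg" is a placeholder path; application default credentials
# are assumed to be configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("plate44.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation holds the full detected string; the rest are words.
for annotation in response.text_annotations:
    print(annotation.description)
```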