Human Generated Data

Title

Virgin and Child

Date

18th-19th century

People

Artist: Nicolò Schiavonetti, Italian 1771 - 1813

Artist after: Peter Paul Rubens, Flemish 1577 - 1640

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G3553

Machine Generated Data

Tags

Amazon
created on 2023-10-24

Art 100
Painting 100
Person 99.2
Face 99.2
Head 99.2
Photography 99.2
Portrait 99.2
Text 55.6
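
The label/confidence pairs above are the kind of output produced by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch using boto3; the S3 bucket, key, and the 55% confidence floor are assumptions for illustration, not the pipeline that generated the data above.

```python
import boto3

# Hypothetical S3 location of the digitized print; substitute your own storage.
BUCKET = "museum-images"
KEY = "prints/virgin-and-child.jpg"

rekognition = boto3.client("rekognition")

# DetectLabels returns label names with 0-100 confidence scores,
# comparable to the "Art 100 ... Text 55.6" list above.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```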

Clarifai
created on 2019-02-27

people 99.5
art 98.3
adult 96.6
portrait 96.3
painting 95.1
retro 94.5
woman 94.3
print 91
child 90.6
wear 90.6
man 90.6
antique 90.2
love 88.7
illustration 88
picture frame 87.7
two 87.6
one 87.3
affection 86.8
old 85.1
group 83
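
Concept scores like the list above could be retrieved from Clarifai's prediction API. The sketch below uses the v2 REST endpoint and general-model identifier roughly as they existed around the 2019 date shown; the endpoint, model ID, credentials, and image URL are all assumptions.

```python
import requests

# Assumed identifiers: Clarifai's v2 REST endpoint, the historical ID of its
# "general" model, and placeholder credentials and image URL.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"
IMAGE_URL = "https://example.org/virgin-and-child.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}", "Content-Type": "application/json"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 value, printed here as a 0-100 score.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```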

Imagga
created on 2019-02-27

sketch 46.1
drawing 33.2
statue 31.8
representation 31.2
sculpture 24.8
book jacket 23.1
art 21.8
jacket 18.9
culture 18
religion 17.9
ancient 17.3
religious 16.9
vintage 16.6
antique 14.7
portrait 14.2
baby 14
old 13.9
newspaper 13.8
wrapping 13.7
stone 13.6
face 13.5
god 13.4
symbol 12.8
marble 12.8
man 12.8
one 12.7
creation 12.3
people 12.3
product 12.3
money 11.1
historic 11
architecture 10.9
close 10.8
history 10.7
male 10.6
painter 10.4
catholic 9.7
museum 9.7
holy 9.6
saint 9.6
love 9.5
historical 9.4
monument 9.3
person 9.3
fetus 9.3
church 9.3
cash 9.2
covering 9.2
black 9
human 9
currency 9
detail 8.9
closeup 8.8
sitting 8.6
daily 8.5
famous 8.4
dollar 8.4
carving 8.2
icon 7.9
masterpiece 7.9
renaissance 7.9
paintings 7.8
model 7.8
artist 7.7
spirituality 7.7
capital 7.6
decoration 7.5
world 7.5
savings 7.5
mother 7.4
lady 7.3
business 7.3
body 7.2
bank 7.2
hair 7.1
adult 7.1
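
The Imagga list follows the shape of Imagga's v2 tagging endpoint, which returns tag names with 0-100 confidences. A hedged sketch with the requests library; the credentials and image URL are placeholders.

```python
import requests

# Placeholder credentials and image URL; substitute real values.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/virgin-and-child.jpg"

# The /v2/tags endpoint responds with
# {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}, ...]}}.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```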

Google
created on 2019-02-27

Microsoft
created on 2019-02-27

wall 97
room 96.1
gallery 91.9
scene 89.2
old 44.9
picture frame 8.9
black and white 8.9
art 8.4
museum 7.7
monochrome 3
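
Tag/confidence pairs of this kind can be produced by Azure's Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder resource endpoint and key.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# tag_image returns tag names with confidences in the 0-1 range;
# scaled to 0-100 here to match the list above.
result = client.tag_image("https://example.org/virgin-and-child.jpg")

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```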

Color Analysis

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Calm 98.3%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
Confused 0%

AWS Rekognition

Age 20-28
Gender Male, 100%
Angry 82.8%
Surprised 8.5%
Fear 6.2%
Calm 5.1%
Sad 2.8%
Disgusted 2.6%
Happy 2.2%
Confused 0.8%
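
The two blocks above match the per-face output of Amazon Rekognition's DetectFaces call with all attributes requested: an estimated age range, a gender guess with confidence, and a set of emotion scores. A minimal boto3 sketch; the S3 location is an assumption.

```python
import boto3

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotion estimates per face.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "museum-images", "Name": "prints/virgin-and-child.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```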

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
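
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages; these come from face annotations. A sketch with the google-cloud-vision client; the local file path is an assumption.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the digitized print.
with open("virgin-and-child.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes Likelihood enums such as VERY_UNLIKELY or POSSIBLE.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```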

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 97.8%
pets animals 1.4%

Captions

Microsoft
created on 2019-02-27

an old photo of a person 76.9%
old photo of a person 72.4%
an old photo of a person in a room 72.3%
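
Ranked captions with confidences like these come from Azure Computer Vision's describe operation. A hedged sketch using the same SDK as the tagging example above; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

# Request up to three caption candidates, mirroring the three lines above.
analysis = client.describe_image("https://example.org/virgin-and-child.jpg", max_candidates=3)

for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```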

Text analysis

Amazon

AND
VIRGIN
VIRGIN AND CHILD.
CHILD.
NILLER,
CHIP
KETHILLIAN NILLER,
KERRITA
KETHILLIAN
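
OCR readings like these, including the garbled renderings of the engraved inscription, are typical of Amazon Rekognition's DetectText, which returns both whole lines and individual words with confidences. A minimal boto3 sketch; the S3 location is an assumption.

```python
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "museum-images", "Name": "prints/virgin-and-child.jpg"}},
)

# Each detection is a LINE or a WORD; the engraved caption appears as both.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```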

Google

VIRGIN AND CHILD
VIRGIN
AND
CHILD
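
The Google results follow the shape of Cloud Vision text detection, where the first annotation holds the full detected string and later entries are individual words. A sketch with google-cloud-vision; the local file path is an assumption.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("virgin-and-child.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full string (e.g. "VIRGIN AND CHILD"); the rest are single words.
for annotation in response.text_annotations:
    print(annotation.description)
```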