Human Generated Data

Title

Sibylla Erythraea

Date

18th century

People

Artist: Giovanni Volpato, Italian, 1735 - 1803

Artist after: Michelangelo Buonarroti, Italian, 1475 - 1564

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G3974

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 97.3
Human 97.3
Art 93.5
Painting 92.7
Drawing 61.1
Photography 58.2
Photo 58.2
Archaeology 56.3
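
The label/confidence pairs above have the shape of an AWS Rekognition DetectLabels response (the museum's exact tagging pipeline is not documented on this page). Below is a minimal sketch of requesting similar tags with boto3; the image path volpato_sibyl.jpg is a hypothetical local copy of the digitized print, not part of this record.

```python
import boto3

# Hypothetical local copy of the digitized print; not part of this record.
IMAGE_PATH = "volpato_sibyl.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# the same shape as the tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50.0,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```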

Clarifai
created on 2019-10-29

people 99.7
adult 98.7
one 98.4
art 96.9
leader 95.3
man 94.6
portrait 93.3
woman 92.1
administration 90.8
writer 88.8
sit 86
print 83.7
two 81.9
illustration 81.6
book series 81.6
furniture 81
wear 80.9
no person 78.3
gown (clothing) 77.7
chair 77.5

Imagga
created on 2019-10-29

statue 50.2
sculpture 40.2
monk 28.6
religion 26.9
art 24.6
ancient 22.5
old 21.6
sketch 20.6
god 20.1
religious 19.7
culture 19.7
history 18.8
architecture 18
church 16.7
antique 16.5
famous 15.8
monument 15
drawing 14.9
representation 14.4
face 14.2
historical 14.1
catholic 13.6
portrait 13.6
stone 13.5
man 13.4
person 13
historic 12.8
pray 12.6
saint 12.5
city 12.5
people 12.3
detail 12.1
model 11.7
temple 11.6
holy 11.6
tourism 11.6
spiritual 11.5
marble 11.5
lady 11.4
column 11.3
travel 11.3
figure 11
attractive 10.5
building 10.3
love 10.3
decoration 10.2
landmark 9.9
vintage 9.9
one 9.7
adult 9.7
spirituality 9.6
faith 9.6
male 9.2
roman 9
sexy 8.8
hair 8.7
worship 8.7
cathedral 8.6
mother 8.3
carving 8.2
symbol 8.1
sauna 8
looking 8
body 8
product 7.9
angel 7.8
sacred 7.8
sitting 7.7
prayer 7.7
creation 7.6
fashion 7.5
book jacket 7.3
newspaper 7.3

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 99.8
book 90.7
person 88.6
drawing 79.2
clothing 79
painting 77.3
art 74.5
museum 74
old 63.8
sketch 54.8

Color Analysis

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 97.2%
Sad 7%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0%
Calm 91.4%
Confused 0.2%
Happy 1%

AWS Rekognition

Age 33-49
Gender Female, 50.8%
Calm 54.7%
Disgusted 45%
Fear 45%
Sad 45.1%
Angry 45%
Confused 45%
Surprised 45%
Happy 45.2%

AWS Rekognition

Age 19-31
Gender Male, 54.6%
Happy 45.7%
Surprised 45.1%
Sad 46.1%
Disgusted 45.1%
Angry 46%
Calm 51.9%
Confused 45.1%
Fear 45.1%
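
The age ranges, gender estimates, and emotion scores above correspond to the fields of an AWS Rekognition DetectFaces response with all face attributes requested. A minimal sketch, again assuming boto3 credentials and the hypothetical volpato_sibyl.jpg image:

```python
import boto3

IMAGE_PATH = "volpato_sibyl.jpg"  # hypothetical local image

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] includes AgeRange, Gender, and Emotions in the response.
faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions come back as a list of {Type, Confidence} entries.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']}, {gender['Confidence']:.1f}%, "
          f"{top['Type'].title()} {top['Confidence']:.1f}%")
```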

Microsoft Cognitive Services

Age 21
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
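
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) match the face-annotation fields returned by the Google Cloud Vision API. A minimal sketch, assuming a recent google-cloud-vision client, application default credentials, and the same hypothetical local image:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses application default credentials

with open("volpato_sibyl.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```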

Feature analysis

Amazon

Person 97.3%
Painting 92.7%

Categories

Captions

Microsoft
created on 2019-10-29

a vintage photo of a person 77.7%
a vintage photo of a person holding a book 48.7%
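
Captions with confidence scores like the two above are what Azure's Computer Vision "describe image" operation returns. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package; the endpoint, key, and image path are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute a real Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("volpato_sibyl.jpg", "rb") as image_stream:  # hypothetical local image
    description = client.describe_image_in_stream(image_stream, max_candidates=3)

# Confidence is reported on a 0-1 scale; multiply by 100 to match the page.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```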

Text analysis

Amazon

ERITHRAA
SIBILLA ERITHRAA
SIBILLA
in
l
Peta
Peta in fornioe acelle atimni /rmlgo) l hestina
hestina
/rmlgo)
atimni
acelle
fornioe
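
The repeated fragments above (whole lines plus individual words of the engraved inscription) are typical of an AWS Rekognition DetectText response, which reports both LINE and WORD detections. A minimal sketch, assuming the same hypothetical local image:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("volpato_sibyl.jpg", "rb") as f:  # hypothetical local image
    image_bytes = f.read()

result = rekognition.detect_text(Image={"Bytes": image_bytes})

# TextDetections mixes LINE and WORD entries, which is why the list above
# repeats fragments of the inscription at both granularities.
for detection in result["TextDetections"]:
    print(f"{detection['Type']}: {detection['DetectedText']} "
          f"({detection['Confidence']:.1f}%)")
```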

Google

SIBILLA ERITHRA A in
SIBILLA
ERITHRA
A
in