Human Generated Data

Title

Virgin Praying

Date

18th century

People

Artist: Gustav Leybold

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2359

Machine Generated Data

Tags

Amazon
created on 2019-08-08

Art 97.1
Human 96.1
Person 96.1
Apparel 93.8
Clothing 93.8
Hat 88
Bonnet 88
Painting 82.3

Clarifai
created on 2019-08-08

people 98.7
portrait 97.9
woman 96
adult 95.5
one 94.9
art 94.3
painting 92.8
retro 92.6
wear 89.2
empty 87.3
fashion 84.9
indoors 84.6
museum 84.2
blank 84.2
girl 82.6
picture frame 82.6
exhibition 80.8
desktop 79.3
child 78.5
music 78.3

Imagga
created on 2019-08-08

covering 38.1
book jacket 29.1
cloak 29
portrait 27.2
jacket 27.1
fashion 23.4
black 22.4
person 20.6
adult 20.1
model 17.9
mug shot 17.8
face 17.8
hair 17.4
wrapping 17.2
people 16.7
money 16.2
man 16.1
attractive 16.1
representation 15.3
studio 15.2
brunette 14.8
photograph 14
business 14
clothing 13.5
currency 13.5
creation 13.4
expression 12.8
style 12.6
pretty 12.6
cash 11.9
dress 11.7
male 11.3
sexy 11.2
make 10.9
dark 10.9
lady 10.5
one 10.5
youth 10.2
dollar 10.2
vintage 9.9
bank 9.9
posing 9.8
cap 9.7
art 9.6
symbol 9.4
old 9.1
human 9
religion 9
statue 8.9
mortarboard 8.8
costume 8.8
happy 8.8
hairstyle 8.6
close 8.6
bill 8.6
culture 8.5
gown 8.5
finance 8.4
clothes 8.4
elegance 8.4
church 8.3
banking 8.3
holding 8.3
sensuality 8.2
wealth 8.1
paper 7.8
elegant 7.7
sign 7.5
economy 7.4
long 7.3
sensual 7.3
success 7.2
suit 7.2
looking 7.2
cute 7.2
financial 7.1
businessman 7.1
happiness 7
museum 7

Google
created on 2019-08-08

Microsoft
created on 2019-08-08

painting 98.6
gallery 97.7
drawing 96.1
human face 95.2
room 94.7
scene 93.9
person 89.2
text 88.6
art 87
clothing 86
sketch 83.1
museum 78.1
woman 70.8
picture frame 37.7

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 22-34
Gender Female, 98.2%
Surprised 0.5%
Calm 91.3%
Disgusted 0.7%
Happy 0.3%
Sad 2.7%
Fear 0.2%
Angry 2.1%
Confused 2.2%

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.1%
Painting 82.3%

Categories

Text analysis

Amazon

oban
Chudenit.
Herrn
'ernin,
ndolpb
oban ndolpb Srafen 'ernin, 3u
Srafen
bon, Chudenit.
Hmer
&llons
Hmer &llons shem. Hehpbernen Herrn Herrn
Hehpbernen
bon,
shem.
3u
Nonlel
ing.
Nonlel e ing.
e
mene

Google

eer Cellns dem.Hogormen Hrrn Hen obann Rudolph Grafen zernin, bon a. gu Chudenit;,
eer
Cellns
dem.Hogormen
Hrrn
Hen
obann
Rudolph
Grafen
zernin,
bon
a.
gu
Chudenit;,