Human Generated Data

Title

Virgin in the Meadow

Date

1810

People

Artist: Pietro Anderloni, Italian 1784 - 1849

Artist after: Raphael, Italian 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G59

Machine Generated Data

Tags

Values are the services' confidence scores on a 0-100 scale.

Amazon
created on 2019-10-29

Human 98.8
Person 98.8
Person 97.2
Art 94
Painting 91.7
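
The Amazon tags above are label-detection output of the kind AWS Rekognition returns. A minimal sketch with boto3, assuming AWS credentials are already configured; the image path is a placeholder:

    import boto3

    # Placeholder path; assumes AWS credentials are configured in the environment.
    client = boto3.client("rekognition")

    with open("print.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # only return labels at or above this confidence
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")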

Clarifai
created on 2019-10-29

people 99.8
child 99.8
baby 99.6
portrait 99
art 98.4
two 98.2
son 96.8
adult 95.7
woman 94.7
one 94.6
family 94.6
man 94.4
print 93.5
painting 93.1
illustration 91.8
sit 91.8
offspring 90.9
love 90.9
group 90.7
girl 90.2
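
The Clarifai concepts can be reproduced in outline with a call to Clarifai's v2 prediction endpoint. The sketch below uses the plain REST API via requests; the API key, model identifier, and image URL are placeholders, and the key-based payload reflects the v2 API as it worked around the time these tags were generated:

    import requests

    # Placeholder credentials, model alias, and image URL.
    API_KEY = "YOUR_CLARIFAI_API_KEY"
    MODEL_ID = "general-image-recognition"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/print.jpg"}}}]},
    )
    resp.raise_for_status()

    # Each concept carries a name and a 0-1 confidence value.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")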

Imagga
created on 2019-10-29

parent 57.2
mother 52.9
child 43.5
dad 31.5
father 28.8
love 26
male 24.7
adult 24.6
sexy 24.1
people 24
attractive 23.8
man 22.2
kin 21.7
couple 20.9
body 20.8
hair 19.8
portrait 19.4
model 18.7
skin 18.6
face 17.8
women 17.4
fashion 16.6
sensuality 15.5
pretty 15.4
erotic 15.2
family 15.1
nude 14.6
naked 14.5
cute 14.4
lady 13.8
relaxation 13.4
happiness 13.3
happy 13.2
smiling 12.3
lifestyle 12.3
person 12.2
sexual 11.6
sitting 11.2
two 11
romance 10.7
healthy 10.7
sex 10.7
kid 10.6
son 10.5
bed 10.4
boy 10.4
passion 10.3
relationship 10.3
black 9.6
husband 9.5
baby 9.3
beach 9.3
hand 9.1
outdoors 9
looking 8.8
together 8.8
eyes 8.6
togetherness 8.5
legs 8.5
lying 8.5
lips 8.3
human 8.3
care 8.2
sensual 8.2
pose 8.2
dress 8.1
childhood 8.1
little 8
brunette 7.8
temptation 7.7
youth 7.7
elegance 7.6
head 7.6
lingerie 7.5
fun 7.5
one 7.5
vacation 7.4
blond 7.3
gorgeous 7.3
figure 7.2
sand 7.2
eye 7.2
handsome 7.1
lovely 7.1
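
Imagga exposes its tagger as a REST endpoint with HTTP Basic authentication. A minimal sketch, with placeholder credentials and image URL:

    import requests

    # Placeholder credentials; Imagga's /v2/tags endpoint takes an image URL
    # and returns tag/confidence pairs like the list above.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/print.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")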

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

gallery 98.6
wall 97.1
person 96.6
human face 95.5
clothing 92.1
room 92.1
baby 90.6
indoor 89.6
text 82.4
scene 81.7
smile 81
woman 77.4
picture frame 71.5
art 70.7
toddler 59.8
posing 51.4
painting 33.7
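
The Microsoft tags correspond to the Tags feature of Azure Computer Vision's analyze operation. A sketch with requests; the resource endpoint, key, and API version are assumptions, since the original run (2019) predates the current versions:

    import requests

    # Placeholder Azure resource endpoint and subscription key.
    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_SUBSCRIPTION_KEY"

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.1/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/print.jpg"},
    )
    resp.raise_for_status()

    # Confidence is reported on a 0-1 scale.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")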

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 53.7%
Calm 48.4%
Fear 45%
Disgusted 45%
Happy 51.4%
Angry 45%
Surprised 45.2%
Confused 45%
Sad 45%

AWS Rekognition

Age 6-16
Gender Female, 94.3%
Disgusted 0%
Calm 99.4%
Angry 0%
Happy 0.3%
Fear 0%
Surprised 0%
Sad 0.2%
Confused 0%

AWS Rekognition

Age 0-3
Gender Female, 54.2%
Calm 54.7%
Sad 45.1%
Disgusted 45%
Confused 45%
Surprised 45%
Angry 45.1%
Fear 45%
Happy 45.1%
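
The age, gender, and emotion estimates above are the per-face attributes AWS Rekognition's DetectFaces operation returns when all attributes are requested. A minimal boto3 sketch, again with a placeholder image path:

    import boto3

    client = boto3.client("rekognition")

    with open("print.jpg", "rb") as f:  # placeholder path
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")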

Microsoft Cognitive Services

Age 23
Gender Female
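
The single age/gender estimate comes from a face-detection call of the kind Microsoft's Face API supported when this record was generated (2019). A sketch with requests; endpoint and key are placeholders, and the age/gender attributes have since been restricted by Microsoft:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder

    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/print.jpg"},
    )
    resp.raise_for_status()

    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].capitalize()}")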

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely
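
The Google Vision results are face-annotation likelihoods. A minimal sketch using the google-cloud-vision client library (a current release; the 2019 run would have used an older one), assuming application-default credentials and a placeholder image path:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("print.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods are reported as enum values (VERY_UNLIKELY ... VERY_LIKELY).
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)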

Feature analysis

Amazon

Person 98.8%
Painting 91.7%
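
The feature analysis entries are the Rekognition labels that come with per-object bounding boxes. A short sketch that reuses DetectLabels and keeps only labels with detected instances; path and threshold are placeholders:

    import boto3

    client = boto3.client("rekognition")

    with open("print.jpg", "rb") as f:  # placeholder path
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=90)

    # Labels with a non-empty "Instances" list carry bounding boxes, which is
    # what the person/painting detections above correspond to.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(f"{label['Name']} {instance['Confidence']:.1f}%", instance["BoundingBox"])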

Categories

Imagga

paintings art 98.4%
pets animals 1.4%
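
The Imagga categories look like output of Imagga's categorization endpoint; the personal_photos categorizer is assumed here because its category names match the pair above. A minimal sketch with placeholder credentials and image URL:

    import requests

    # Placeholder credentials; the categorizer choice is an assumption.
    resp = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.org/print.jpg"},
        auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),
    )
    resp.raise_for_status()

    for category in resp.json()["result"]["categories"]:
        print(f"{category['name']['en']} {category['confidence']:.1f}")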

Captions