Human Generated Data

Title

Virgin in the Meadow

Date

19th century

People

Artist: Joseph Steinmüller, German 1795 - 1841

Artist after: Raphael, Italian 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G3713

Machine Generated Data

Tags

Amazon
created on 2019-11-06

Human 98.7
Person 98.7
Person 98
Baby 97.8
Face 95.3
Art 91.9
Female 90.3
Newborn 78.3
Photo 76.6
Photography 76.6
Portrait 76.5
Girl 69.9
Woman 69.7
Child 69.1
Kid 69.1
Painting 68.4
Drawing 63.3
People 61.2

Clarifai
created on 2019-11-06

people 99.8
child 99.7
portrait 99.1
two 97.3
baby 96.6
son 95.7
girl 95
adult 94.7
wear 94.3
offspring 93.5
one 93.4
group 92.5
woman 92
art 91.9
family 91.8
facial expression 89.6
music 88.6
retro 88.1
love 88
sibling 87.1

Imagga
created on 2019-11-06

kin 51.9
parent 42.5
mother 40
father 37.4
child 36.5
family 35.6
happy 33.8
dad 32.9
man 30.2
love 30
people 29
portrait 24.6
couple 24.4
adult 23.3
daughter 23.2
smiling 23.1
male 22.1
happiness 21.9
boy 20.9
together 19.3
kid 18.6
wife 18
home 17.5
husband 17.3
sitting 17.2
attractive 16.8
smile 16.4
cute 15.8
face 15.6
baby 14.6
casual 14.4
pretty 14
lifestyle 13.7
son 13.4
lady 13
looking 12.8
women 12.6
world 12.3
relationship 12.2
fun 12
person 11.7
group 11.3
model 10.9
sofa 10.7
fashion 10.5
togetherness 10.4
culture 10.3
joy 10
childhood 9.8
handsome 9.8
cheerful 9.8
hair 9.5
relaxed 9.4
adorable 9.2
room 9.1
relaxing 9.1
one 9
sexy 8.8
body 8.8
affectionate 8.7
couch 8.7
loving 8.6
youth 8.5
house 8.4
leisure 8.3
vintage 8.3
holding 8.3
human 8.2
children 8.2
brunette 7.8
parents 7.8
black 7.8
men 7.7
girlfriend 7.7
relaxation 7.5
rest 7.4
guy 7.4
girls 7.3
little 7.1
indoors 7

Google
created on 2019-11-06

Microsoft
created on 2019-11-06

gallery 99.6
room 99.5
scene 99.5
human face 96.6
person 96.4
clothing 95.5
smile 94
posing 91.1
text 82.6
woman 79.2
baby 76.4
old 67.9
boy 51.1
painting 40.1
picture frame 14.2

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 2-8
Gender Female, 54.4%
Disgusted 45%
Fear 45%
Calm 49.9%
Surprised 46.2%
Happy 48.4%
Confused 45.1%
Angry 45.3%
Sad 45%

AWS Rekognition

Age 15-27
Gender Female, 54.8%
Calm 49.7%
Surprised 45.1%
Disgusted 45.1%
Angry 48.5%
Happy 46.4%
Sad 45.2%
Fear 45%
Confused 45.1%

AWS Rekognition

Age 13-25
Gender Female, 54.5%
Happy 45%
Angry 45%
Calm 54.5%
Surprised 45%
Sad 45.4%
Fear 45%
Confused 45%
Disgusted 45%

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Painting 68.4%

Captions

Microsoft

a painting of a man 88.7%
a painting of a man and a woman posing for a photo 57.6%
a painting of a man and woman posing for a photo 46.3%

Text analysis

Amazon

4..