Human Generated Data

Title

Women and Children in a Garden

Date

1875-1900

People

Artist: Unidentified Artist

Previous attribution: Jean Frédéric Bazille, French, 1841-1870

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. Margarett Sargent McKean, 1971.124

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Painting 99.9
Art 99.9
Human 96.4
Person 96.4
Person 94.4
Person 93.7
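The Amazon list above reports "Person" three times at different confidences, one entry per detected instance. As a minimal sketch (not the museum's actual pipeline), label output in this shape can be deduplicated to one entry per label at its highest confidence, dropping anything below a threshold; the values below are transcribed from the list above:

```python
# Labels transcribed from the Amazon tag list above: (name, confidence).
labels = [
    ("Painting", 99.9), ("Art", 99.9), ("Human", 96.4),
    ("Person", 96.4), ("Person", 94.4), ("Person", 93.7),
]

def dedupe(labels, threshold=90.0):
    """Keep each label once, at its highest confidence, above a threshold."""
    best = {}
    for name, conf in labels:
        if conf >= threshold:
            best[name] = max(best.get(name, 0.0), conf)
    # Sort by confidence, descending; Python's sort is stable for ties.
    return sorted(best.items(), key=lambda kv: -kv[1])

print(dedupe(labels))
# [('Painting', 99.9), ('Art', 99.9), ('Human', 96.4), ('Person', 96.4)]
```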

Clarifai
created on 2019-11-09

painting 99.8
art 99.8
people 99.1
woman 98.8
veil 98.6
adult 97.8
wear 96.8
Renaissance 96.5
dress 96.4
baby 93.6
gown (clothing) 93.4
illustration 92.7
baroque 90.7
child 90.5
one 90.4
Mary 90
man 89.6
furniture 88.2
princess 87.8
print 87.3

Imagga
created on 2019-11-09

mosquito net 85.5
protective covering 52.1
covering 35.4
four-poster 27.5
bed 26.8
adult 21.3
furniture 20
people 19.5
portrait 19.4
person 19
dress 19
fashion 18.1
attractive 17.5
bedroom furniture 16.5
love 15.8
face 15.6
sensuality 13.6
dark 13.4
happiness 13.3
sexy 12.8
human 12.7
elegance 12.6
pretty 12.6
happy 12.5
model 12.4
smiling 12.3
room 12.1
groom 12.1
smile 12.1
light 12
sitting 12
hair 11.9
interior 11.5
man 11.4
stone 11
child 10.7
bride 10.5
alone 10
furnishing 10
lady 9.7
one 9.7
male 9.3
wedding 9.2
cradle 9.2
travel 9.1
old 9
color 8.9
night 8.9
lifestyle 8.7
mystery 8.6
cute 8.6
expression 8.5
wall 8.4
fun 8.2
style 8.2
posing 8
life 7.8
glass 7.8
luxury 7.7
window 7.4
cave 7.4
baby bed 7.3
black 7.2
bedroom 7.2
religion 7.2
women 7.1

Google
created on 2019-11-09

Painting 98.4
Picture frame 93.8
Art 90.1
Lady 88.8
Visual arts 86.8
Modern art 81.7
Still life 61.9
Stock photography 59.4
Portrait 59.3
Artwork 52.3

Microsoft
created on 2019-11-09

indoor 95.5
drawing 93.6
woman 81.7
person 79
text 76.7
art 73.7
painting 71.9
clothing 67.2
human face 50.6
picture frame 49.9

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 52.5%
Disgusted 45.1%
Sad 46.8%
Fear 45.1%
Calm 52.4%
Angry 45.2%
Happy 45%
Surprised 45.1%
Confused 45.3%

AWS Rekognition

Age 2-8
Gender Female, 53.7%
Surprised 45.1%
Disgusted 45%
Angry 45.1%
Calm 53.6%
Fear 45.1%
Confused 45.1%
Happy 45.1%
Sad 46%

AWS Rekognition

Age 14-26
Gender Female, 53.6%
Happy 45%
Disgusted 45.1%
Angry 45.2%
Fear 45.1%
Calm 45.6%
Surprised 45%
Confused 45.1%
Sad 53.9%
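Each AWS Rekognition face block above reports a confidence per emotion, and the values do not sum to 100%, so the reported mood is whichever emotion scores highest (Calm at 52.4% for the first face). A minimal sketch of that selection, using scores transcribed from the first face block (the dictionary name is ours, not part of Rekognition's API):

```python
# Per-emotion confidences transcribed from the first AWS Rekognition
# face block above (Age 22-34).
emotions = {
    "Disgusted": 45.1, "Sad": 46.8, "Fear": 45.1, "Calm": 52.4,
    "Angry": 45.2, "Happy": 45.0, "Surprised": 45.1, "Confused": 45.3,
}

def dominant(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

name, conf = dominant(emotions)
print(name, conf)  # Calm 52.4
```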

Microsoft Cognitive Services

Age 3
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.9%
Person 96.4%

Captions

Microsoft

a painting of a person 80.3%
a painting of a person 80.2%
a painting on the wall 80.1%