Human Generated Data

Title

The First Steps

Date

c. 1780–1785

People

Artist: Jean-Honoré Fragonard, French, 1732–1806

Artist: Marguerite Gérard, French, 1761–1837

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Charles E. Dunlap, 1961.166

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-07-16

Painting 99.9
Art 99.9
Person 99.7
Human 99.7
Person 99
Person 97.8
Person 89.8
Person 69.1
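
Labels like these can be reproduced with Amazon Rekognition's DetectLabels operation. The sketch below is a minimal example using boto3; the S3 bucket and object names are placeholders, not part of the original record.

import boto3

# Minimal sketch: request label predictions for one image from Rekognition.
# The S3 bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "first-steps.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Rekognition reports confidence as a percentage, matching the scores above.
    print(f"{label['Name']} {label['Confidence']:.1f}")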

Clarifai
created on 2019-07-16

people 99.8
religion 99.1
adult 99
woman 98.7
group 97.4
art 96.6
wear 95.2
painting 94.6
one 90.4
two 87.8
child 87.4
sit 86.6
reclining 86.6
furniture 86.2
man 86.1
position 85.7
three 84.2
veil 83.8
seat 83.8
recreation 82.8
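
These concepts match the output of Clarifai's general image-recognition model. A minimal sketch against Clarifai's v2 REST predict endpoint follows; the API key, model ID, and image URL are placeholders and may differ by account.

import requests

# Sketch of a Clarifai v2 predict request; key, model ID, and URL are placeholders.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/first-steps.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores concepts from 0 to 1; scale by 100 to match the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")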

Imagga
created on 2019-07-16

person 31
dancer 25.1
adult 23
people 20.6
performer 20.6
attractive 20.3
male 19.9
man 19.5
couple 19.2
happy 18.2
portrait 16.2
fashion 15.8
fan 15.7
lifestyle 15.2
smile 15
entertainer 14.8
model 14.8
lady 14.6
love 14.2
style 14.1
couch 13.5
religion 13.4
sitting 12.9
kin 12.8
hair 12.7
follower 12.7
together 12.3
sexy 12
pretty 11.9
elegance 11.8
happiness 11.7
dress 11.7
room 11.2
art 11.1
sensuality 10.9
holiday 10.7
human 10.5
looking 10.4
men 10.3
culture 10.3
traditional 10
face 9.9
life 9.9
statue 9.8
old 9.8
interior 9.7
black 9.6
body 9.6
home 9.6
antique 9.5
two 9.3
oriental 9.3
stage 9.1
gold 9
handsome 8.9
posing 8.9
sculpture 8.7
mother 8.7
smiling 8.7
god 8.6
elegant 8.6
religious 8.4
joy 8.3
vintage 8.3
makeup 8.2
romantic 8
women 7.9
cute 7.9
sofa 7.8
pray 7.8
luxury 7.7
spiritual 7.7
faith 7.7
relax 7.6
erotic 7.6
tradition 7.4
peace 7.3
indoor 7.3
sensual 7.3
child 7.3
romance 7.1
father 7.1
night 7.1
look 7
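
These tags correspond to Imagga's v2 tagging endpoint, which authenticates with an API key and secret over HTTP basic auth. The sketch below uses placeholder credentials and a placeholder image URL.

import requests

# Sketch of an Imagga v2 tagging request; credentials and URL are placeholders.
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/first-steps.jpg"},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Each entry pairs a tag name with a 0-100 confidence score.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")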

Google
created on 2019-07-16

Microsoft
created on 2019-07-16

painting 98.8
person 90.6
art 88.6
clothing 78.4
woman 73.5
human face 63.4
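
These tags match the Azure Computer Vision Analyze operation. The sketch below targets the v2.0 REST API that was current when this data was generated; the endpoint, subscription key, and image URL are placeholders.

import requests

# Sketch of an Azure Computer Vision "analyze" request for tags.
# Endpoint, key, and image URL are placeholders.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
KEY = "YOUR_COMPUTER_VISION_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/first-steps.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is reported from 0 to 1; scale by 100 to match the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")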

Face analysis

AWS Rekognition

Age 60-90
Gender Female, 89.1%
Surprised 4.4%
Happy 28.8%
Disgusted 1.6%
Sad 47.3%
Angry 5.9%
Confused 2.4%
Calm 9.6%

AWS Rekognition

Age 16-27
Gender Female, 82.8%
Disgusted 1%
Calm 15.4%
Happy 10.2%
Confused 1%
Surprised 1.3%
Sad 68.9%
Angry 2.2%

AWS Rekognition

Age 20-38
Gender Female, 57.1%
Calm 56.3%
Surprised 4.3%
Happy 10.7%
Angry 3.9%
Sad 20.2%
Disgusted 0.8%
Confused 3.8%

AWS Rekognition

Age 4-7
Gender Male, 54.8%
Disgusted 45.3%
Surprised 45.5%
Confused 45.6%
Happy 45.1%
Calm 49.3%
Sad 47.9%
Angry 46.3%
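
The four face entries above are the kind of output Rekognition's DetectFaces operation returns when all facial attributes are requested. A minimal boto3 sketch follows; the S3 location is a placeholder.

import boto3

# Sketch: per-face age range, gender, and emotion confidences from Rekognition.
# The S3 bucket and key are hypothetical placeholders.
rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "first-steps.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Each emotion carries its own confidence percentage.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")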

Microsoft Cognitive Services

Age 25
Gender Male

Microsoft Cognitive Services

Age 16
Gender Female

Microsoft Cognitive Services

Age 71
Gender Female
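
The Microsoft age and gender estimates correspond to the Azure Face API's detect operation with age and gender attributes requested. The sketch below uses the v1.0 REST API; the endpoint, key, and image URL are placeholders.

import requests

# Sketch of an Azure Face API detect request for age and gender.
# Endpoint, key, and image URL are placeholders.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
KEY = "YOUR_FACE_API_KEY"

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/first-steps.jpg"},
)
response.raise_for_status()

for face in response.json():
    attributes = face["faceAttributes"]
    print(f"Age {attributes['age']:.0f}")
    print(f"Gender {attributes['gender'].title()}")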

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
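
The Google Vision rows are per-face likelihood ratings (Very unlikely through Very likely) rather than numeric scores. A minimal sketch using the google-cloud-vision Python client follows; the image URI is a placeholder.

from google.cloud import vision

# Sketch: per-face likelihood enums from the Cloud Vision face-detection API.
# The image URI is a placeholder.
client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.com/first-steps.jpg")
)

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)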

Feature analysis

Amazon

Painting 99.9%
Person 99.7%

Captions

Microsoft

a person standing next to a stuffed animal 29.2%
a person standing in front of a stuffed animal 28.9%
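
These captions match the Computer Vision Describe operation, which returns ranked caption candidates with confidence scores. The sketch below again targets the v2.0 REST API; the endpoint, key, and image URL are placeholders.

import requests

# Sketch of an Azure Computer Vision "describe" request for caption candidates.
# Endpoint, key, and image URL are placeholders.
ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
KEY = "YOUR_COMPUTER_VISION_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v2.0/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/first-steps.jpg"},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # Confidence is reported from 0 to 1; scale by 100 to match the values above.
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")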