Human Generated Data

Title

The Beautiful Feronniere

Date

19th century

People

Artist: François Eugène Augustin Bridoux, French, 1813 - 1892

Artist after: Leonardo da Vinci, Italian, 1452 - 1519

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G543

Machine Generated Data

Tags

Amazon
created on 2019-11-06

Human 98.9
Person 98.9
Art 98.7
Painting 92.3
Art Gallery 83.5
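
The Amazon labels and confidence scores above have the shape of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how similar tags could be produced, assuming boto3, configured AWS credentials, and a local copy of the image (the file name is hypothetical; the exact pipeline used to generate these tags is not documented in this record):

import boto3

# Hypothetical client and image path; the actual image file is not part of this record.
rekognition = boto3.client("rekognition")
with open("print_g543.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=80,  # keep only fairly confident labels, comparable to the list above
    )
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")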

Clarifai
created on 2019-11-06

people 98.7
portrait 98.6
painting 97.4
adult 97
one 96.4
museum 96.3
woman 95.9
wear 94.9
retro 92.4
exhibition 92.3
art 91.2
picture frame 88.5
girl 87.6
wall 85.8
man 83.7
album 82.1
empty 81.7
nostalgia 81
landscape 79.9
vintage 79.1

Imagga
created on 2019-11-06

portrait 29.1
attractive 25.9
person 23.4
adult 23.4
people 22.3
model 21
fashion 20.3
lady 18.7
sexy 18.5
face 18.5
pretty 17.5
groom 15.7
happy 15.7
dress 15.4
hair 15.1
brunette 14.8
black 14.5
bride 13.3
bow tie 12.9
man 12.8
style 12.6
elegance 12.6
old 12.5
vintage 12.4
clothing 12.3
smiling 12.3
cute 12.2
studio 12.2
smile 12.1
expression 11.9
love 11.8
business 11.5
looking 11.2
youth 11.1
wedding 11
gorgeous 10.9
happiness 10.2
male 10
posing 9.8
human 9.7
businessman 9.7
eyes 9.5
sitting 9.4
lifestyle 9.4
statue 9.4
sensuality 9.1
holding 9.1
make 9.1
bouquet 9
cheerful 8.9
suit 8.7
couple 8.7
necktie 8.6
clothes 8.4
room 8.4
makeup 8.2
blackboard 8.2
sensual 8.2
office 8.1
sculpture 8.1
home 8
women 7.9
look 7.9
child 7.8
standing 7.8
gown 7.8
blond 7.8
color 7.8
luxury 7.7
elegant 7.7
married 7.7
hairstyle 7.6
executive 7.6
laptop 7.5
world 7.5
indoor 7.3
mother 7.3
pose 7.2
body 7.2
romantic 7.1
lovely 7.1
boy 7.1
modern 7

Google
created on 2019-11-06

Microsoft
created on 2019-11-06

gallery 99.2
room 98.6
scene 98.5
art 97.8
wall 97.3
indoor 96.7
drawing 95.8
human face 95.1
person 91.2
white 88.8
woman 88.2
museum 87.7
clothing 85.9
sketch 85.1
old 41.3
painting 17.9
picture frame 11.2

Color Analysis

Face analysis

AWS Rekognition

Age 28-44
Gender Female, 90.2%
Disgusted 0.1%
Angry 1.5%
Fear 0.2%
Calm 95.6%
Confused 0.4%
Happy 0.3%
Sad 1.1%
Surprised 0.7%
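
The age range, gender, and emotion percentages above match the shape of output returned by Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch under the same assumptions (boto3, AWS credentials, a hypothetical local image file):

import boto3

rekognition = boto3.client("rekognition")
with open("print_g543.jpg", "rb") as f:  # hypothetical file name
    faces = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")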

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 99.9%