Human Generated Data

Title

Portrait of a Woman

Date

1820s

People

Artist: Unidentified Artist

Previous attribution: William Dunlap, American, 1766–1839

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Louise E. Bettens Fund, 1935.32

Machine Generated Data

Tags (label confidence scores, percent)

Amazon
created on 2019-04-07

Art 93.9
Painting 91
Person 81.1
Human 81.1
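
The Amazon tags above are plain name/confidence pairs of the kind the Rekognition DetectLabels API returns. A minimal sketch of how such values could be produced, assuming boto3, a local copy of the image, and an arbitrary confidence threshold (none of which are stated in this record):

# Hypothetical sketch: label/confidence pairs like the Amazon tags above,
# via the Rekognition DetectLabels API. The file name, region, and
# MinConfidence threshold are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("1935.32.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=80,                  # assumed threshold; the project's is unknown
)

for label in response["Labels"]:
    # e.g. "Art 93.9", "Painting 91.0"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')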

Clarifai
created on 2018-03-16

people 99.8
one 99.3
portrait 99.2
adult 98.7
wear 95.7
woman 95.5
elderly 94.5
old 92.8
art 90.8
print 90.2
painting 89.6
veil 87.4
person 86.9
leader 85.8
necklace 81
facial expression 77
dress 76.9
jacket 76.8
famous 74.3
hair 73.8

Imagga
created on 2018-03-16

person 37.8
portrait 36.9
model 36.6
attractive 36.4
costume 33.9
fashion 32.4
pretty 32.2
princess 31.1
adult 29.8
hair 28.6
lady 28.4
face 26.3
smile 25.7
people 25.1
cloak 25
happy 24.5
brunette 24.4
aristocrat 22
sexy 21.7
make 20.9
smiling 20.3
pose 19.9
dress 19.9
expression 19.6
cute 19.4
hairstyle 19.1
covering 19.1
posing 18.7
black 18.2
eyes 16.4
clothes 15.9
human 15.8
studio 15.2
crown 15.2
makeup 14.7
ethnic 14.3
lovely 14.2
hat 14.1
gorgeous 13.6
style 13.4
jacket 12.1
lips 12
looking 12
elegant 12
culture 12
skin 11.9
sensual 11.8
sensuality 11.8
elegance 11.8
traditional 11.7
crown jewels 11.5
wearing 11.4
jewelry 10.7
look 10.5
blond 10.5
body 10.4
clothing 10.4
women 10.3
nice 10.1
teenager 10
chain mail 9.9
fashionable 9.5
tradition 9.2
joy 9.2
one 9
braid 8.6
youth 8.5
teen 8.3
long 8.3
holding 8.3
stylish 8.1
brown 8.1
closeup 8.1
body armor 8.1
cover girl 7.9
happiness 7.8
modern 7.7
bride 7.7
cosmetic 7.7
casual 7.6
hand 7.6
head 7.6
feminine 7.5
emotion 7.4
slim 7.4
20s 7.3
cheerful 7.3
business 7.3

Google
created on 2018-03-16

portrait 94.4
painting 88
lady 85.2
art 82.5
vintage clothing 50
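
The Google tags above match the shape of Cloud Vision label-detection output; the API reports scores in the 0-1 range, so the figures here appear to be scaled to percentages. A minimal sketch, assuming the google-cloud-vision client library and a local copy of the image (both assumptions):

# Hypothetical sketch: label/score pairs like the Google tags above,
# via the Cloud Vision label-detection API. The file name is an assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("1935.32.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # score is 0-1; scaled to match the percentage-style figures above
    print(f"{label.description.lower()} {label.score * 100:.1f}")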

Microsoft
created on 2018-03-16

person 99.1
woman 93

Face analysis

AWS Rekognition

Age 45-65
Gender Male, 75.5%
Confused 0.5%
Sad 2.8%
Calm 63.4%
Disgusted 0.4%
Happy 0.5%
Surprised 0.6%
Angry 31.8%
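
The age range, gender, and emotion percentages above follow the shape of Rekognition's DetectFaces output. A minimal sketch of how such values could be read back, assuming boto3 and a local copy of the image (both assumptions):

# Hypothetical sketch: age-range, gender, and emotion estimates like those
# above, via the Rekognition DetectFaces API. File name and region are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("1935.32.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],                  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]               # e.g. {"Low": 45, "High": 65}
    gender = face["Gender"]              # e.g. {"Value": "Male", "Confidence": 75.5}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:     # e.g. Calm, Angry, Sad ...
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')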

Microsoft Cognitive Services

Age 64
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
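
The "Very unlikely" entries above are likelihood buckets of the kind the Google Cloud Vision face-detection API returns for each face. A minimal sketch, assuming the google-cloud-vision client and a local copy of the image (both assumptions):

# Hypothetical sketch: reading likelihood buckets such as "Very unlikely"
# from Cloud Vision face detection. The file name is an assumption; the
# Likelihood enum values are part of the public API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("1935.32.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: UNKNOWN, VERY_UNLIKELY, ... VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)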

Feature analysis

Amazon

Painting 91%
Person 81.1%

Text analysis

Amazon

dancoauknocno
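
The string above is the raw text Rekognition detected in the image; since the painting contains no legible writing, the result is essentially noise. A minimal sketch of the call that could produce it, with the file name and region as assumptions:

# Hypothetical sketch: raw detected-text output like the string above,
# via the Rekognition DetectText API. File name and region are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("1935.32.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])   # e.g. "dancoauknocno"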