Human Generated Data

Title

Head of a Woman

Date

c. 1525

People

Artist: Unidentified Artist

Previous attribution: Jacopo Palma (called il Vecchio), Italian c. 1479 - 1528

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1951.147

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Art 97.6
Face 93.1
Human 93.1
Person 85.2
Head 80.5
Drawing 79.5
Sketch 68.5
Painting 68.2
Photography 65.6
Photo 65.6
Portrait 65.6
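
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. The following is a minimal sketch of how such a list could be reproduced with boto3; the image file name and region are assumptions for illustration, not part of this record.

    # Minimal sketch (not the exact pipeline behind the data above):
    # call AWS Rekognition DetectLabels on a local image and print
    # label/confidence pairs in the same "name score" form as the list above.
    import boto3

    IMAGE_PATH = "head_of_a_woman.jpg"  # hypothetical file name
    client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open(IMAGE_PATH, "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=60.0,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")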

Clarifai
created on 2020-04-24

portrait 99.9
people 99.9
one 99.4
adult 99.4
art 98.8
print 98.1
old 96.4
wear 96.3
antique 95.8
man 95.7
vintage 94.9
engraving 94.5
painting 93.7
leader 93.7
illustration 90.6
retro 88
writer 84.1
administration 83.9
facial hair 81.7
scientist 81.6

Imagga
created on 2020-04-24

portrait 27.8
sketch 23.1
face 22.7
window screen 21.9
representation 20.9
money 20.4
cash 19.2
person 18
currency 18
head 17.6
drawing 17.5
man 17.5
screen 17
close 16.6
book jacket 16.1
hair 15.9
banking 15.6
skin 15.5
bill 15.2
protective covering 15.2
covering 15
one 14.9
dollar 14.9
old 14.6
adult 14.2
male 14.2
bank 13.4
people 13.4
jacket 12.5
expression 11.9
finance 11.8
financial 11.6
eyes 11.2
sculpture 11.1
paper 11
mug shot 10.9
banknote 10.7
human 10.5
attractive 10.5
sexy 10.4
business 10.3
model 10.1
wealth 9.9
art 9.8
look 9.6
pay 9.6
serious 9.5
wrapping 9.5
health 9
spa 9
wet 8.9
beard 8.9
dollars 8.7
bath 8.5
adolescent 8.5
bust 8.5
photograph 8.4
black 8.4
pretty 8.4
relaxation 8.4
economy 8.3
emotion 8.3
closeup 8.1
body 8
nose 7.9
banknotes 7.8
shower 7.8
us 7.7
culture 7.7
savings 7.5
juvenile 7.4
note 7.4
sensuality 7.3
lifestyle 7.2
looking 7.2
eye 7.1
smile 7.1

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

sketch 99.4
drawing 99.4
human face 98.1
painting 96.8
person 95.7
text 88.6
old 86.9
art 85.5
window 82.7
portrait 82.2
woman 68.9
picture frame 30

Face analysis

AWS Rekognition

Age 10-20
Gender Female, 97.5%
Fear 0.2%
Surprised 0.2%
Disgusted 5.6%
Calm 84.2%
Sad 6.9%
Angry 1.7%
Happy 0.2%
Confused 0.9%
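
The age range, gender confidence, and per-emotion percentages above match the shape of AWS Rekognition's DetectFaces output when all attributes are requested. A minimal sketch, assuming a local image file and default AWS credentials; it is not the exact code used to generate this record.

    # Minimal sketch: request full face attributes from AWS Rekognition
    # and print age range, gender, and per-emotion confidences.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open("head_of_a_woman.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")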

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
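
The Google Vision rows above report likelihood categories (e.g. "Very unlikely") rather than percentages; they correspond to the likelihood enums on face annotations from the Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library (v2+) and a local image file:

    # Minimal sketch: run Cloud Vision face detection and print the same
    # likelihood fields shown above (surprise, anger, sorrow, joy,
    # headwear, blur).
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("head_of_a_woman.jpg", "rb") as f:  # hypothetical file name
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    for face in response.face_annotations:
        likelihood = vision.Likelihood  # enum names like VERY_UNLIKELY
        print("Surprise", likelihood(face.surprise_likelihood).name)
        print("Anger", likelihood(face.anger_likelihood).name)
        print("Sorrow", likelihood(face.sorrow_likelihood).name)
        print("Joy", likelihood(face.joy_likelihood).name)
        print("Headwear", likelihood(face.headwear_likelihood).name)
        print("Blurred", likelihood(face.blurred_likelihood).name)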

Feature analysis

Amazon

Person 85.2%
Painting 68.2%

Categories

Imagga

people portraits 99.2%

Captions

Microsoft
created on 2020-04-24

an old photo of a person 68.3%
old photo of a person 62.9%
an old photo of a building 57.6%