Human Generated Data

Title

Head of a Woman

Date

16th-17th century

People

Artist: Unidentified Artist

Previous attribution: Peter Paul Rubens, Flemish 1577 - 1640

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Yves Henry Buhler, for study purposes, 1955.68

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 100
Face 100
Skin 99.5
Head 88.2
Photo 78.8
Portrait 78.8
Photography 78.8
Art 75.8
Painting 75.8
Person 75
Freckle 56
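
For illustration only, and not the museum's documented pipeline: a minimal Python sketch of how label tags like those above could be requested from the AWS Rekognition DetectLabels API via boto3. The image file name is a placeholder.

```python
import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# Hypothetical local copy of the image.
with open("head_of_a_woman.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

# Print each label with its confidence score, e.g. "Face 100.0", "Skin 99.5".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```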

Clarifai
created on 2020-04-24

portrait 98.7
one 98.7
people 98.7
man 98.3
old 97.4
adult 95.7
art 95.6
antique 93.6
texture 93.3
paper 90.5
retro 90.1
vintage 86.5
wear 84.4
dirty 84.3
artistic 83
ancient 81.9
monochrome 79.8
dark 77.8
sepia 74.4
desktop 74.4

Imagga
created on 2020-04-24

old 26.5
grunge 26.4
negative 25.2
face 24.9
mug shot 23.6
portrait 22.6
aged 22.6
vintage 22.3
antique 21.6
photograph 21.2
film 20.9
man 20.2
ancient 19.9
male 19.1
person 19.1
grungy 19
texture 18.8
wall 17.3
representation 17.2
art 16.8
paper 16.5
black 16.2
skin 15.9
retro 15.6
close 15.4
body 15.2
photographic paper 14.7
paint 14.5
border 14.5
model 14
attractive 14
adult 13.6
canvas 13.3
rough 12.8
head 12.6
people 12.3
creation 12
space 11.6
design 11.6
parchment 11.5
worn 11.5
serious 11.4
human 11.2
expression 11.1
emotion 11.1
pattern 10.9
frame 10.8
handsome 10.7
world 10.7
aging 10.5
sexy 10.4
empty 10.3
blank 10.3
guy 10.1
wallpaper 10
adolescent 9.9
backdrop 9.9
material 9.8
photographic equipment 9.8
backgrounds 9.7
one 9.7
text 9.6
damaged 9.5
old fashioned 9.5
fashion 9
spa 9
juvenile 8.9
look 8.8
hair 8.7
hands 8.7
naked 8.7
water 8.7
lifestyle 8.7
stained 8.7
eyes 8.6
men 8.6
screen 8.5
drops 8.5
book jacket 8.4
health 8.3
dirty 8.1
detail 8
looking 8
textured 7.9
smile 7.8
window screen 7.8
stone 7.8
decay 7.7
muscular 7.6
cemetery 7.6
dirt 7.6
rusty 7.6
bath 7.6
athlete 7.6
decorative 7.5
dark 7.5
covering 7.4
closeup 7.4
fitness 7.2

Google
created on 2020-04-24

Face 96.1
Photograph 95.8
Head 91.1
Chin 88.3
Nose 87
Portrait 84.1
Self-portrait 83.4
Jaw 83.1
Forehead 78.2
Human 74.7
Art 65.5
Photography 62.4
Stock photography 62.1
Neck 59.9
Chest 57
Black-and-white 56.4
Ear 51.2
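
A minimal sketch, assuming the google-cloud-vision client library, of how label scores like the Google list above could be retrieved; credentials and the file name are placeholders.

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("head_of_a_woman.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are returned in the 0-1 range; scale to match the percentages above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```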

Microsoft
created on 2020-04-24

text 89.9
human face 89.5
drawing 84.3
sketch 81.9
painting 64.9
black and white 64.2

Face analysis

AWS Rekognition

Age 20-32
Gender Male, 75.9%
Calm 35%
Disgusted 0.2%
Surprised 0.2%
Fear 0.2%
Happy 4.2%
Angry 0.4%
Sad 59.3%
Confused 0.6%
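
A minimal sketch of how age, gender, and emotion estimates of this form could be obtained from the AWS Rekognition DetectFaces API; the image file name is a placeholder, and this is not necessarily how the values above were produced.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("head_of_a_woman.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```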

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
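
A minimal sketch, assuming the google-cloud-vision client library, of how likelihood ratings like those above are exposed by Google Vision face detection; the file name is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("head_of_a_woman.jpg", "rb") as f:
    image = vision.Image(content=f.read())

faces = client.face_detection(image=image).face_annotations

# Likelihoods come back as enum values; map them to readable strings.
likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in faces:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```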

Feature analysis

Amazon

Painting 75.8%
Person 75%

Captions

Microsoft
created on 2020-04-24

a close up of a mans face 70.2%
a close up of a persons face 70.1%
close up of a mans face 64.6%
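
A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK, of how captions of this kind could be generated with Azure Computer Vision. The endpoint, key, and file name are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("head_of_a_woman.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries a 0-1 confidence, e.g. "a close up of a mans face 70.2%".
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```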