Human Generated Data

Title

Self-Portrait

Date

1977

People

Artist: Judith Golden, American, born 1934

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.194.7

Copyright

© Judith Golden

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 97.7
Face 97.7
Sunglasses 95.2
Accessory 95.2
Accessories 95.2
Head 92.8
Person 86.1
Advertisement 82.6
Poster 82.6
Text 70.4
Collage 68.5
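
The Amazon tags above are the kind of labels returned by the AWS Rekognition label-detection API. Below is a minimal sketch of how such labels might be requested with boto3; the file name and thresholds are illustrative assumptions, not part of this record.

```python
# Sketch: requesting image labels from AWS Rekognition with boto3.
# The file name and thresholds are illustrative placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("self_portrait.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=60,
)

# Print each label with its confidence score, e.g. "Sunglasses 95.2"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```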

Imagga
created on 2022-02-26

makeup 54.8
slick 48.7
portrait 46.7
face 44.9
lipstick 42
pretty 40
attractive 37.9
make 36.4
model 35.9
person 34.4
hair 33.4
fashion 31.8
eyes 31.1
cosmetic 31
cute 28.8
lips 28.8
adult 28.5
sexy 28.2
skin 26.7
lady 26.1
smile 24.3
people 24.1
brunette 22.7
happy 22.6
smiling 22.5
covering 22.4
gorgeous 21.8
close 20.6
youth 20.5
mask 20.2
posing 18.7
looking 18.5
look 18.5
eye 17
cosmetics 16.9
women 16.7
disguise 16.6
toiletry 15
clothing 15
one 15
expression 14.5
pose 14.5
hand 14.5
attire 14.3
cheerful 13.9
studio 13.7
human 13.5
happiness 13.4
closeup 12.8
sensuality 12.8
elegance 12.6
hairstyle 12.4
mouth 12.4
style 11.9
sensual 11.9
head 11.8
lovely 11.6
health 11.2
stylish 10.9
black 10.9
natural 10.7
lifestyle 10.2
joy 10.1
care 9.9
fresh 9.8
rouge 9.8
brush 9.6
brown 9.6
seductive 9.6
thinking 9.5
healthy 9.5
girls 9.1
luxury 8.6
casual 8.5
20s 8.3
fun 8.3
color 7.8
thoughtful 7.8
comedian 7.7
fashionable 7.6
blond 7.5
confident 7.3
businesswoman 7.3
body 7.2
teeth 7.1
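
The Imagga tags above resemble the response of Imagga's image-tagging service. The sketch below uses the requests library against what is assumed to be Imagga's public v2 tagging endpoint; the endpoint details, credentials, and image URL are assumptions, not taken from this record.

```python
# Sketch: requesting tags from the Imagga REST API (assumed v2 /tags endpoint).
# API credentials and the image URL are placeholders.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/self_portrait.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Print each tag with its confidence, e.g. "makeup 54.8"
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```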

Google
created on 2022-02-26

Forehead 98.5
Cheek 97.9
Lip 97
Chin 96.6
Lipstick 94.6
Eyebrow 94.2
Eyelash 92.5
Jaw 88.1
Font 83.3
Eye liner 83.1
Art 81.8
Headpiece 78.8
Black hair 78.7
Poster 75.8
Earrings 75.2
Long hair 74.6
Makeover 72.5
Eyewear 72.1
Wig 71.5
Eye shadow 70.7
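
The Google tags above are the kind of labels the Google Cloud Vision API returns from label detection. A minimal sketch using the official client library follows; the file name is a placeholder for a local copy of the image.

```python
# Sketch: label detection with the Google Cloud Vision client library.
# The file name is a placeholder for a local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("self_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Print each label with its score scaled to 0-100, e.g. "Forehead 98.5"
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))
```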

Microsoft
created on 2022-02-26

person 98.8
human face 88.8
text 88.2
poster 82.3
face 64.8
woman 63.2
spectacles 9.7
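
The Microsoft tags above, and the captions listed further down, correspond to the kind of result returned by the Azure Computer Vision analyze endpoint. A rough sketch against the REST API follows; the endpoint, key, API version, and image URL are assumptions, not part of this record.

```python
# Sketch: tagging and describing an image with the Azure Computer Vision REST API.
# Endpoint, key, API version, and image URL are placeholders/assumptions.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                      # placeholder
IMAGE_URL = "https://example.org/self_portrait.jpg"                # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()
analysis = resp.json()

# Tags with confidences, e.g. "person 98.8"
for tag in analysis["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))

# Candidate captions, e.g. "a close up of a person wearing a mask 75.9"
for caption in analysis["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))
```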

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 100%
Angry 98.5%
Calm 1.1%
Surprised 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%
Happy 0%
Confused 0%
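
The age range, gender, and emotion scores above are the attributes AWS Rekognition returns from its face-detection API. A minimal boto3 sketch follows; the file name is a placeholder for a local copy of the image.

```python
# Sketch: face attribute detection with AWS Rekognition via boto3.
# The file name is a placeholder for a local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("self_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    # Emotions come back with individual confidence scores, e.g. "Angry 98.5%"
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```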

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
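
The Google Vision results above are reported as likelihood buckets rather than percentages. The sketch below shows how such face-annotation likelihoods might be read with the client library; the file name is a placeholder.

```python
# Sketch: face-detection likelihoods with the Google Cloud Vision client library.
# The file name is a placeholder for a local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("self_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a likelihood bucket, e.g. "Joy VERY_UNLIKELY"
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```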

Feature analysis

Amazon

Sunglasses 95.2%
Person 86.1%

Captions

Microsoft

a close up of a person wearing a mask 75.9%
close up of a person wearing a mask 72.5%
a close up of a person wearing a costume 71.8%

Text analysis

Amazon

PLAYS
IT
IT HOT
HOT
to
dark
the
but
the image
image
to break
lived, the rule
rule
too
but Hy the
lived,
break
mar
Hy the
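
The word fragments above are the kind of output AWS Rekognition's text-detection API returns for printed text in the photograph. A minimal boto3 sketch follows; the file name is a placeholder for a local copy of the image.

```python
# Sketch: text detection with AWS Rekognition via boto3.
# The file name is a placeholder for a local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("self_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a LINE or a WORD fragment, e.g. "PLAYS", "IT HOT"
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```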

Google

IT
FLAYS
lavk
tut
the
y
FLAYS lavk IT HOT tut y the the rul
HOT
rul
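
The Google fragments above are the kind of output the Cloud Vision text-detection (OCR) feature returns. A minimal sketch with the client library follows; the file name is a placeholder for a local copy of the image.

```python
# Sketch: text detection (OCR) with the Google Cloud Vision client library.
# The file name is a placeholder for a local copy of the image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("self_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```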