Human Generated Data

Title

Untitled (man applying lipstick and lip liner to woman's face)

Date

c. 1960

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15421

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.3
Human 99.3
Face 96.4
Finger 86.3
Mouth 68.8
Lip 68.8
Blonde 55
Kid 55
Child 55
Girl 55
Female 55
Woman 55
Teen 55

Imagga
created on 2022-03-05

face 38.4
portrait 32.4
person 31.2
attractive 30.8
makeup 30.3
people 25.7
pretty 25.2
lipstick 23.3
lips 23.2
hair 22.2
fashion 21.9
make 21.8
smile 21.4
device 21.1
adult 20.7
lady 20.3
women 19.8
model 19.5
crayfish 19.2
hand 19
syringe 18.9
happy 18.8
skin 18.4
close 18.3
medical instrument 16.3
health 16
smiling 15.9
human 15.8
one 15.7
eyes 15.5
sexy 15.3
cute 15.1
lifestyle 13.7
cosmetic 13.7
looking 13.6
eye 13.4
arthropod 12.9
closeup 12.8
instrument 12.6
crustacean 12.5
care 12.4
work 11.8
serious 11.5
man 11.4
mouth 11.3
cosmetics 11.2
business 10.9
holding 10.7
look 10.5
brunette 10.5
style 10.4
feminine 10.3
youth 10.2
happiness 10.2
healthy 10.1
teenager 10
brush 10
teeth 9.5
casual 9.3
professional 9.3
glasses 9.3
phone 9.2
girls 9.1
telephone 8.6
expression 8.5
male 8.5
walking stick 8.4
black 8.4
modern 8.4
head 8.4
treatment 8.3
gorgeous 8.2
dental appliance 8.1
fresh 7.9
talking 7.6
eating 7.6
call 7.6
life 7.5
joy 7.5
retainer 7.5
restraint 7.2
mirror 7.2
salon 7.1

Google
created on 2022-03-05

Nose 98.4
Cheek 97.9
Lip 97
Chin 96.6
Lipstick 94.9
Mouth 93.4
Eyelash 92.5
Jaw 88
Gesture 85.3
Cosmetics 85
Makeover 83.9
Pink 83.8
Eye liner 82.6
Magenta 81.2
Art 79.4
Beauty 76
Eye shadow 75.1
Music artist 74.9
Font 73.1
Paint 72.8

Microsoft
created on 2022-03-05

person 99.6
human face 73.8
lipstick 65.1
face 60.1
lip 53.5
toothbrush 20.1

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-33
Gender Female, 99.9%
Calm 84.7%
Sad 11.3%
Happy 1.2%
Angry 0.7%
Fear 0.6%
Surprised 0.6%
Confused 0.5%
Disgusted 0.3%

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a close up of a woman brushing her teeth 26.4%
close up of a woman brushing her teeth 25.8%
a woman brushing her teeth 25.7%

Text analysis

Amazon

ou-Sa
eee
P51
MJ13--YT3 eee
MJ13--YT3

Google

PS1
MJIA- -YT3FA2- -NAGON e ɛ ɛ PS1
MJIA-
-YT3FA2-
-NAGON
ɛ
e