Human Generated Data

Title

C putting on her makeup at Second Tip, Bangkok

Date

1992

People

Artist: Nan Goldin, American, born 1953

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Anne Ehrenkranz in honor of Gabriella De Ferrari, P1994.3

Copyright

© Nan Goldin

Machine Generated Data

Tags

Amazon
created on 2019-04-02

Person 99.7
Human 99.7
Person 99.7
Bar Counter 80.1
Pub 80.1
Finger 78.1
Hair 60.2
Skin 59.8
Face 55.8
Club 55.8
Back 55.7
Clothing 55.7
Apparel 55.7
Sleeve 55.7
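
The label/confidence pairs above match the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be produced with boto3 follows; the bucket and object key are placeholders, not taken from this record.

```python
import boto3

# Placeholder image location; the actual source image is not specified in this record.
BUCKET = "example-bucket"
KEY = "goldin_c_second_tip_bangkok.jpg"

client = boto3.client("rekognition")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MaxLabels=20,
    MinConfidence=55.0,  # the lowest confidence listed above is ~55.7
)

# Print label name and confidence, mirroring the "Label  NN.N" layout above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```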

Clarifai
created on 2018-04-18

people 99.4
adult 98.7
woman 97.9
two 97.3
portrait 97.1
music 96.8
one 96.7
man 94.8
model 93.9
wear 93.6
group 90.1
mirror 89.4
indoors 89.2
singer 88.5
facial expression 88.4
monochrome 88.2
fashion 87.6
street 87.3
musician 85.4
movie 85.2
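
The Clarifai concepts above (created in 2018) resemble predictions from Clarifai's general image-recognition model. A hedged sketch against the v2 REST endpoint as it existed around that time is shown below; the API key, model ID, and image URL are all placeholders.

```python
import requests

# Placeholder credentials, model ID, and image URL; none are taken from this record.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general model ID; verify in your Clarifai account
IMAGE_URL = "https://example.org/image.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```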

Imagga
created on 2018-04-18

adult 30.6
sexy 29.7
man 27.6
portrait 26.5
couple 26.1
people 25.7
person 24.8
fashion 23.4
male 22.8
black 22.3
attractive 21
love 20.5
body 20
face 19.9
passion 19.7
sensual 19.1
two 17.8
pretty 16.8
hair 16.6
skin 15.7
sexual 15.4
erotic 15.3
lady 14.6
girlfriend 14.4
human 14.2
dark 14.2
model 14
boyfriend 13.5
women 13.4
handsome 13.4
passionate 12.8
sensuality 12.7
sex 12.6
romance 12.5
lovers 11.6
blond 11.4
together 11.4
fan 11.4
style 11.1
expression 11.1
intimacy 10.8
romantic 10.7
happy 10.7
hug 10.6
disk jockey 10.5
one 10.5
lips 10.2
emotion 10.1
lingerie 9.8
lifestyle 9.4
makeup 9.3
follower 9.2
hand 9.1
make 9.1
gorgeous 9.1
night 8.9
men 8.6
youth 8.5
broadcaster 8.4
studio 8.4
leisure 8.3
holding 8.3
fun 8.2
guy 8.1
posing 8
crazy 7.9
embracing 7.8
embrace 7.8
lover 7.8
hugging 7.8
muscular 7.6
wife 7.6
pair 7.6
togetherness 7.6
friends 7.5
relationship 7.5
feminine 7.5
entertainment 7.4
joyful 7.4
dress 7.2
husband 7.2
interior 7.1
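
Imagga exposes auto-tagging through a REST API; a sketch of the kind of request that yields tag/confidence pairs like those above is shown below. The credentials and image URL are placeholders.

```python
import requests

# Placeholder credentials and image URL; not taken from this record.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/image.jpg"

# The v2 tagging endpoint returns a list of tags with confidence scores.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```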

Google
created on 2018-04-18

snapshot 81.8
girl 76.1
audio 67.2
muscle 65.4
midnight 54.4
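
The Google tags above resemble label annotations from the Cloud Vision API. A minimal sketch using the google-cloud-vision client follows; the file path is a placeholder.

```python
from google.cloud import vision

# Placeholder local path; the actual source image is not part of this record.
IMAGE_PATH = "image.jpg"

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

# label_detection returns LabelAnnotation objects with a description and a 0-1 score.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```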

Microsoft
created on 2018-04-18

person 99.4
woman 95.3
indoor 89.1
posing 49.4
female 26.2
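
The Microsoft tags are in the style of Azure Computer Vision image analysis. A sketch against the plain REST analyze endpoint is given below; the resource endpoint, key, and image URL are placeholders, and the tags in this record were generated against an earlier (2018) version of the service.

```python
import requests

# Placeholder Azure resource endpoint, key, and image URL; not taken from this record.
ENDPOINT = "https://example-resource.cognitiveservices.azure.com"
KEY = "YOUR_COMPUTER_VISION_KEY"
IMAGE_URL = "https://example.org/image.jpg"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Each tag carries a name and a 0-1 confidence score.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```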

Face analysis

AWS Rekognition

Age 23-38
Gender Female, 77.5%
Angry 0.7%
Sad 8.8%
Calm 86.2%
Happy 0.7%
Surprised 1.3%
Disgusted 0.8%
Confused 1.5%

AWS Rekognition

Age 10-15
Gender Female, 90.3%
Angry 4.8%
Disgusted 2.5%
Confused 4.2%
Surprised 1.2%
Calm 26.2%
Sad 59.8%
Happy 1.3%
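
Both face estimates above (age range, gender, and per-emotion confidence) match the shape of Amazon Rekognition's DetectFaces output. A minimal sketch with a placeholder image location:

```python
import boto3

# Placeholder image location; not taken from this record.
BUCKET = "example-bucket"
KEY = "goldin_c_second_tip_bangkok.jpg"

client = boto3.client("rekognition")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    Attributes=["ALL"],  # include age range, gender, and emotion estimates
)

# One FaceDetail per detected face; this record lists two.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```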

Microsoft Cognitive Services

Age 20
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
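
The Google Vision rows report likelihood buckets (Very unlikely, Unlikely, and so on) rather than numeric scores; these correspond to the face annotation fields in the Cloud Vision API. A minimal sketch with a placeholder file path:

```python
from google.cloud import vision

IMAGE_PATH = "image.jpg"  # placeholder; the source image is not part of this record

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation reports likelihood enums such as VERY_UNLIKELY or UNLIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```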

Feature analysis

Amazon

Person 99.7%
