Human Generated Data

Title

Marlene, Colette, and Naomi on the street, Boston

Date

1973, printed 1990-1991

People

Artist: Nan Goldin, American (born 1953)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.218

Copyright

© Nan Goldin

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Accessory 100
Accessories 100
Handbag 100
Bag 100
Person 99.8
Human 99.8
Person 99.7
Person 99.1
Shoe 92
Clothing 92
Apparel 92
Footwear 92
Purse 89.6
Shoe 76.6
Shoe 68.8
Person 65.8
Person 61.3
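
The labels and scores above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch using boto3; the file name, region, and confidence threshold are illustrative placeholders, not values taken from this record:

import boto3

def rekognition_labels(image_path, min_confidence=60):
    # DetectLabels returns object/scene labels with 0-100 confidence,
    # matching the "Handbag 100", "Person 99.8" entries listed above.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

for name, score in rekognition_labels("photo.jpg"):
    print(f"{name} {score:.1f}")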

Clarifai
created on 2018-03-23

people 99.8
woman 98.5
adult 97.4
music 95.3
group 94.4
man 94.1
portrait 93.5
monochrome 93
two 91.4
wear 90.6
musician 90.4
singer 89.6
actress 89
group together 87.9
child 86.9
street 85.8
one 84
dancer 80.8
three 79.9
movie 79.7
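
Concept lists like this are what Clarifai's general prediction model produces. A hedged sketch against the v2 REST endpoint; the endpoint path, model name, and key handling are assumptions and may differ from whatever client this pipeline actually used:

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"  # placeholder credential

def clarifai_concepts(image_url):
    # POST an image URL to the general model and read back concept
    # names with 0-1 confidence values (scaled to percent below).
    response = requests.post(
        "https://api.clarifai.com/v2/models/general-v1.3/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
        timeout=30,
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    return [(c["name"], c["value"] * 100) for c in concepts]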

Imagga
created on 2018-03-23

fashion 44.5
sexy 36.9
model 36.5
adult 35.7
attractive 35
person 31.6
dress 29.8
portrait 27.2
people 26.8
sensual 24.5
hair 23.8
clothes 23.4
pretty 23.1
style 23
dark 22.5
one 20.9
clothing 20.5
lady 20.3
passion 18.8
happy 18.8
black 18.8
outfit 18.1
blond 17.9
body 17.6
elegant 16.3
stylish 16.3
elegance 15.9
seductive 15.3
face 14.9
studio 14.4
skin 14.4
women 14.2
posing 14.2
interior 14.1
lifestyle 13.7
splashes 13.7
cute 13.6
human 13.5
wet 13.4
erotic 13.2
rain 13.2
brunette 13.1
sensuality 12.7
shower 12.6
fashionable 12.3
drops 12.3
water 12
expression 11.9
makeup 11.9
passionate 11.8
happiness 11.7
vogue 11.6
legs 11.3
enjoy 11.3
dance 11.2
locker 10.9
room 10.6
gesture 10.5
couple 10.4
standing 10.4
looking 10.4
luxury 10.3
casual 10.2
dinner dress 10.1
man 10.1
make 10
gorgeous 10
garment 9.9
seduce 9.8
modern 9.8
pour 9.7
pleasure 9.4
romantic 8.9
fastener 8.7
shop 8.7
sitting 8.6
hairstyle 8.6
male 8.6
smile 8.5
fun 8.2
lovely 8
urban 7.9
length 7.8
device 7.7
two 7.6
charming 7.6
emotion 7.4
costume 7.4
inside 7.4
20s 7.3
boutique 7.3
full 7.3
smiling 7.2
transparent 7.2
night 7.1
love 7.1
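
Imagga exposes tagging through a REST call with HTTP basic auth. A sketch against its /v2/tags endpoint; the key and secret are placeholders and the response shape is an assumption based on the public v2 API:

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credential
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder credential

def imagga_tags(image_url):
    # Each tag's English name sits under tag["tag"]["en"], with a
    # confidence already on a 0-100 scale, as in the list above.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    return [(t["tag"]["en"], t["confidence"]) for t in response.json()["result"]["tags"]]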

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 98.2
posing 85.7
standing 83.1
people 66
white 61.6
group 56.6
crowd 0.6
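
The Microsoft tags correspond to the Tags feature of the Azure Computer Vision analyze endpoint. A sketch over REST; the region host, API version, and subscription key are placeholders, not details of this record:

import requests

AZURE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

def azure_image_tags(image_url):
    # Request only the Tags visual feature; confidences come back 0-1
    # and are scaled to percentages to match the values above.
    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    return [(t["name"], t["confidence"] * 100) for t in response.json()["tags"]]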

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 54.8%
Happy 45.8%
Confused 45.8%
Sad 46.1%
Angry 46.8%
Calm 48.3%
Disgusted 45.5%
Surprised 46.5%

AWS Rekognition

Age 26-43
Gender Female, 55%
Sad 45.3%
Angry 52.3%
Disgusted 45.9%
Surprised 45.4%
Calm 45.3%
Happy 45.3%
Confused 45.5%

AWS Rekognition

Age 26-43
Gender Female, 55%
Happy 45.3%
Disgusted 45.4%
Calm 48.5%
Surprised 46.5%
Sad 46%
Confused 46.9%
Angry 46.4%
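
The blocks above (an age range, a gender estimate, and per-emotion scores for each detected face) match the FaceDetails structure returned by Amazon Rekognition's DetectFaces operation with full attributes. A minimal boto3 sketch; the file name and region are placeholders:

import boto3

def rekognition_faces(image_path):
    client = boto3.client("rekognition", region_name="us-east-1")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include AgeRange, Gender, Emotions
        )
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": {e["Type"]: e["Confidence"] for e in face["Emotions"]},
        })
    return faces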

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female
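
The Microsoft age and gender estimates above are the kind of attributes the classic Azure Face API detect endpoint returned. A hedged sketch; newer Face API versions restrict these attributes, and the host and key here are placeholders:

import requests

FACE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
FACE_KEY = "YOUR_SUBSCRIPTION_KEY"                                 # placeholder

def face_age_gender(image_url):
    # Each detected face carries faceAttributes with estimated age and gender.
    response = requests.post(
        f"{FACE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": FACE_KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    return [(f["faceAttributes"]["age"], f["faceAttributes"]["gender"])
            for f in response.json()]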

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
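
The likelihood labels above (Very unlikely, Unlikely, and so on) are Google Cloud Vision's face-detection likelihood enums. A sketch with the google-cloud-vision Python client (v2+ enum style); the file name is a placeholder and credentials are read from the environment:

from google.cloud import vision

def google_face_likelihoods(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    faces = []
    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
        faces.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return faces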

Feature analysis

Amazon

Handbag 100%
Person 99.8%
Shoe 92%