Human Generated Data

Title

Club Cornich, New York City

Date

February 1977

People

Artist: Larry Fink, American, 1941 - 2023

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Neal and Susan Yanofsky, P2004.36

Copyright

© Larry Fink

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Back 99.9
Human 99.6
Person 99.6
Finger 61.2
Undershirt 57.5
Clothing 57.5
Apparel 57.5
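
The numbers next to each machine-generated tag are confidence scores returned by the respective image-labeling service. As a rough illustration, tags like the Amazon list above could come from a call such as the AWS Rekognition detect_labels sketch below; the file name, label limit, and confidence threshold are assumptions for illustration only, not values taken from this record.

    # Hypothetical sketch: image labels with confidence scores from AWS Rekognition.
    # The file path, MaxLabels, and MinConfidence values are illustrative assumptions.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("club_cornich.jpg", "rb") as f:  # assumed local copy of the photograph
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,
            MinConfidence=50,
        )

    # Prints lines such as "Person 99.6", matching the format of the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")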

Clarifai
created on 2023-10-25

girl 99.7
nude 99.6
portrait 99.5
woman 99
monochrome 99
people 98.2
model 97.3
art 97.1
sexy 97
naked 96.9
studio 96.7
adult 95.5
fashion 94.1
lingerie 94
erotic 93.5
body 93.3
beautiful 92.8
shadow 92.4
lady 91.5
vintage 91.2

Imagga
created on 2022-01-09

sexy 45.8
model 45.2
body 44
erotic 43.9
cover girl 43.4
black 40.1
attractive 38.5
adult 33.1
nude 30.1
person 29.3
skin 28.9
fashion 28.7
hair 27.8
pretty 26.6
naked 26.1
people 25.1
sensuality 24.6
sensual 23.7
lady 23.6
lingerie 23.5
brunette 22.7
portrait 22
face 21.3
studio 20.5
seductive 20.1
dark 20.1
legs 19.8
posing 17.8
sexual 17.4
one 17.2
clothing 16
gorgeous 15.4
love 15
style 14.9
iron 14.4
cute 14.4
passion 14.1
slim 13.8
lifestyle 13.8
dinner dress 13.6
human 13.5
elegance 13.5
sitting 12.9
breast 12.7
pose 12.7
bare 12.7
women 12.7
desire 12.5
dancer 12.5
home appliance 12.5
lovely 12.5
blond 11.9
make 11.8
glamor 11.5
man 10.8
healthy 10.7
sex 10.7
looking 10.4
feminine 10.3
happy 10
dress 10
torso 9.8
underwear 9.7
temptation 9.6
appliance 9.6
garment 9.4
expression 9.4
lips 9.3
figure 9.1
bra 8.8
tan 8.7
consumer goods 8.5
health 8.3
back 8.3
romance 8.1
undressed 7.9
buttocks 7.9
hands 7.8
eyes 7.8
elegant 7.7
girlfriend 7.7
performer 7.7
chair 7.6
wet 7.2
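
The Imagga tags above follow the same tag-plus-confidence pattern. A minimal sketch of how such tags might be fetched from Imagga's v2 tagging endpoint is shown below; the credentials and image URL are placeholders, and the response layout is an assumption based on Imagga's public documentation rather than anything in this record.

    # Hypothetical sketch: tags with confidence scores from the Imagga v2 REST API.
    # Credentials and the image URL are placeholders, not real values.
    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder
    IMAGGA_SECRET = "your_api_secret"  # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Prints lines such as "sexy 45.8", matching the format of the list above.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")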

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

monitor 99.2
text 99.1
wall 97.5
indoor 93.1
clothing 92.3
television 90.6
person 87.4
human face 83.4
screen 80.8
woman 78.8
black and white 69.2
display 56.2
computer 40.2
flat 33.5
picture frame 14.3

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 99.8%
Calm 75.4%
Happy 14.2%
Sad 3.4%
Angry 1.9%
Fear 1.5%
Surprised 1.3%
Disgusted 1.2%
Confused 1%
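
The age range, gender, and emotion percentages above correspond to the face attributes AWS Rekognition returns when all attributes are requested. A minimal sketch, assuming a local copy of the image and default AWS credentials:

    # Hypothetical sketch: face attributes (age range, gender, emotions) from AWS Rekognition.
    # The file path is an assumption for illustration.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("club_cornich.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")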

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
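
Unlike the percentage scores above, Google Vision reports face attributes as likelihood categories (Very unlikely through Very likely). A minimal sketch of how such values might be obtained with the google-cloud-vision client; the file name is an assumption for illustration.

    # Hypothetical sketch: face likelihoods from the Google Cloud Vision API.
    # The file path is an assumption for illustration.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("club_cornich.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Prints enum names such as "Surprise VERY_UNLIKELY" / "Joy LIKELY".
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)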

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

events parties 61.2%
text visuals 33.4%
paintings art 2.7%