Human Generated Data

Title

Alexander Liberman

Date

1956

People

Artist: Irving Penn, American, 1917 - 2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Alexander Liberman, P1981.22.2

Copyright

© The Irving Penn Foundation

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-09

Human 100
Face 100
Person 95.5
Beard 89.5
Head 87.7
Portrait 78.1
Photography 78.1
Photo 78.1
Mustache 68.7
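
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. A minimal sketch using boto3, assuming AWS credentials are configured and the photograph is available as a local JPEG (the filename is hypothetical):

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the image; any JPEG/PNG bytes work.
with open("penn_alexander_liberman_1956.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=60,  # roughly the lowest score listed above
)

# Each label carries a name and a 0-100 confidence score,
# e.g. "Human 100", "Beard 89.5".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```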

Clarifai
created on 2023-10-25

portrait 100
man 99.6
monochrome 99.3
people 99.3
one 99.1
face 99.1
adult 97.3
guy 97.2
model 95.7
studio 95.6
eye 95.4
black and white 94.8
sepia 93.3
mustache 93
boy 92.6
beard 92.1
dark 91.8
facial hair 91.7
person 90.8
fashion 87.6

Imagga
created on 2022-01-09

beard 75.4
face 49.8
portrait 48.6
mustache 41.2
person 40.4
male 39.8
man 38.4
adult 33.7
model 31.9
black 30.9
attractive 30.8
handsome 30.3
expression 29
people 27.9
serious 25.8
guy 24.6
human 24
close 24
head 23.5
eyes 23.3
looking 22.4
fashion 21.9
studio 21.3
youth 20.5
one 19.4
look 19.3
dark 19.2
hair 19
casual 18.7
skin 17.2
confident 16.4
style 15.6
sexy 15.3
eye 14.3
adolescent 14
make 13.6
sunglasses 13.3
boy 13.1
cute 12.9
teenager 12.8
sensual 11.8
masculine 11.7
depression 11.7
facial 11.5
brunette 11.3
lips 11.1
lifestyle 10.9
closeup 10.8
pensive 10.8
juvenile 10.6
pretty 10.5
nose 10.5
thinking 10.5
men 10.3
emotion 10.2
smile 10
pose 10
hand 9.9
modern 9.8
posing 9.8
cool 9.8
expressive 9.7
hat 9.7
stare 9.7
sad 9.7
professional 9.3
glasses 9.3
alone 9.1
stylish 9.1
suit 8.7
mature 8.4
clothing 8.3
happy 8.2
business 7.9
ear 7.7
stress 7.7
profile 7.7
mouth 7.5
lady 7.3
spectacles 7.3
women 7.1
businessman 7.1
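
Imagga exposes its tagger as a REST endpoint, and a request like the following returns tag/confidence pairs in the shape listed above. This is a sketch against Imagga's documented v2 tagging API; the credentials and image URL are placeholders:

```python
import requests

IMAGGA_API_KEY = "your_api_key"        # placeholder
IMAGGA_API_SECRET = "your_api_secret"  # placeholder
image_url = "https://example.org/penn_alexander_liberman_1956.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Tags arrive as {"confidence": 75.4, "tag": {"en": "beard"}} objects.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```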

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

man 97.6
human face 95.2
face 91.5
person 89.9
looking 89.4
portrait 89
indoor 88.3
text 86.4
black and white 61.4
eyes 51.9
staring 35.8
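
The Microsoft tags are consistent with Azure Computer Vision image analysis, which reports tag names with 0-1 confidences (shown above scaled to percentages). A sketch using the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"   # placeholder
key = "your_subscription_key"                                       # placeholder
image_url = "https://example.org/penn_alexander_liberman_1956.jpg"  # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
analysis = client.analyze_image(image_url, visual_features=[VisualFeatureTypes.tags])

# Confidence is on a 0-1 scale; multiply by 100 to compare with the
# percentages listed above (e.g. "man 97.6").
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```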

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-44
Gender Male, 97.9%
Calm 88.2%
Sad 5.5%
Angry 3.2%
Disgusted 1.1%
Confused 0.7%
Happy 0.5%
Surprised 0.5%
Fear 0.2%
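
The age range, gender, and emotion estimates above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, with the same hypothetical local image file as in the label-detection example:

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the image, as in the earlier sketch.
with open("penn_alexander_liberman_1956.jpg", "rb") as f:
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion ordering is not guaranteed; sort by confidence to mirror
    # the listing above (Calm, Sad, Angry, ...).
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```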

Feature analysis

Amazon

Person 95.5%

Categories

Imagga

pets animals 100%