Human Generated Data

Title

Alexander Liberman

Date

1956

People

Artist: Irving Penn, American, 1917–2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Alexander Liberman, P1981.22.3

Copyright

© The Irving Penn Foundation

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 100
Face 100
Head 99.1
Person 98.2
Photography 90.2
Photo 90.2
Portrait 90.2
Advertisement 79.9
Poster 75.1
Drawing 75
Art 75
Text 70.8
Man 57.7
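The label/score pairs above are typical of the output of AWS Rekognition's DetectLabels operation, which returns label names with confidence percentages. As an illustration only (the exact call that produced this record is not documented here), here is a minimal sketch of flattening a DetectLabels-style response into name/score pairs; the sample response is hand-made, seeded with a few values from the list above:

```python
# Sketch: flattening a Rekognition DetectLabels-style response into
# (name, confidence) pairs like those listed above. The response dict
# below is a hypothetical sample, not the actual API output for this image.

def labels_to_pairs(response, min_confidence=55.0):
    """Return (name, rounded confidence) tuples, highest confidence first."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

sample_response = {
    "Labels": [
        {"Name": "Human", "Confidence": 100.0},
        {"Name": "Face", "Confidence": 100.0},
        {"Name": "Portrait", "Confidence": 90.2},
        {"Name": "Man", "Confidence": 57.7},
    ]
}

for name, score in labels_to_pairs(sample_response):
    print(name, score)
```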

Clarifai
created on 2023-10-25

portrait 100
monochrome 99.7
people 99.7
man 99.6
one 98.3
adult 98.3
face 97.8
black and white 96.4
model 93.9
studio 93.2
actor 92.5
guy 92.5
self 91.8
eye 91.1
smile 89.9
person 87.3
art 84.3
facial hair 78.5
mood 75
sepia 74.7

Imagga
created on 2022-01-09

beard 100
portrait 48.6
man 47.1
male 47
face 46.3
person 45.5
handsome 34.8
adult 33
looking 32.1
expression 30.8
people 29.6
guy 28.8
attractive 26.6
eyes 25.9
serious 25.8
human 25.5
close 24.6
black 23.7
model 22.6
one 22.4
look 21.1
head 20.2
men 19.8
casual 19.5
hair 19.1
adolescent 19
youth 18.8
studio 16.7
closeup 16.2
eye 16.1
dark 15.9
boy 14.8
depression 14.6
juvenile 14.5
business 14
happy 13.8
lifestyle 13.8
confident 13.7
stare 13.6
smile 13.6
professional 13.5
suit 13.5
thinking 13.3
businessman 13.3
mature 13
sexy 12.9
alone 12.8
smiling 12.3
pensive 11.7
masculine 11.7
cheerful 11.4
skin 11.4
fashion 11.3
success 11.3
senior 11.3
style 11.1
glasses 11.1
hand 10.7
sad 10.6
nose 10.5
emotion 10.2
pretty 9.8
expressive 9.7
think 9.6
good 9.4
friendly 9.2
to 8.9
cute 8.6
tie 8.6
shirt 8.4
old 8.4
lips 8.3
single 8.2
gray 8.1
corporate 7.7
modern 7.7
executive 7.6
trendy 7.5
successful 7.3
lady 7.3
blond 7.3
office 7.2
happiness 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

man 99.9
wall 99.7
human face 98.2
text 96.9
indoor 96.5
person 96.3
portrait 94.4
book 90.8
looking 85.7
forehead 75.4
black and white 73.7
eyebrow 64.8
chin 64.8
clothing 64.7
face 63.2
jaw 55.6
wrinkle 50.6
male 15.6

Face analysis


AWS Rekognition

Age 47-53
Gender Male, 100%
Calm 98.3%
Happy 0.6%
Disgusted 0.3%
Angry 0.3%
Sad 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0.1%

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.2%

Categories

Imagga

paintings art 62%
people portraits 37.9%