Human-Generated Data

Title

Self-Portrait

Date

20th century

People

Artist: Irving Penn, American, 1917–2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Alexander Liberman, P1990.7

Copyright

© The Irving Penn Foundation

Machine-Generated Data

Tags

Amazon
created on 2022-01-22

Person 97.6
Human 97.6
Tripod 94.8
Clothing 89.9
Apparel 89.9
Shoe 87.7
Footwear 87.7
Suit 81.7
Overcoat 81.7
Coat 81.7
Shoe 80.6
Photo 79
Photography 79
Face 70.8
Portrait 68.9
Photographer 67.5
Electronics 59.6
Man 57.1
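
These labels are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of a call that could produce such a list, assuming boto3 with configured AWS credentials; the region and the local file name self_portrait.jpg are assumptions, not details from this record:

    import boto3

    # Rekognition client; the region is an assumption.
    client = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("self_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # the list above bottoms out near 57
    )

    # Each label carries a name and a 0-100 confidence score,
    # matching rows such as "Person 97.6" above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')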

Clarifai
created on 2023-10-26

people 99.8
art 99
man 98.3
adult 97.4
movie 97.3
tripod 97.3
one 96.6
lens 96.4
portrait 95.9
journalist 95.6
vintage 95.6
two 94.7
documentary 92.5
actor 91.3
analogue 91.1
street 90.9
photojournalism 90.9
retro 90.9
rangefinder 88
snapshot 87.3
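
Clarifai tags like these are typically retrieved from its v2 predict endpoint. A hedged sketch using plain HTTP; the endpoint shape, the general-image-recognition model id, and the file name are assumptions about the deployment that produced these tags:

    import base64
    import requests

    API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed general model id
    URL = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"

    # Hypothetical local copy of the photograph.
    with open("self_portrait.jpg", "rb") as f:
        b64 = base64.b64encode(f.read()).decode()

    resp = requests.post(
        URL,
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": b64}}}]},
        timeout=30,
    )
    resp.raise_for_status()

    # Concepts come back with a 0-1 "value"; scaling by 100 gives
    # percentage-style scores like "people 99.8" above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')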

Imagga
created on 2022-01-22

man 31.6
musical instrument 27.9
mask 26
wind instrument 25.2
male 24.8
photographer 21.2
person 20.1
soldier 18.6
tripod 17
accordion 16.7
silhouette 16.5
military 16.4
people 16.2
protection 15.5
gun 14.8
danger 14.5
black 14
weapon 14
portrait 13.6
keyboard instrument 13.5
device 13.5
men 12.9
army 12.7
protective 12.7
adult 12.6
war 12.5
uniform 12.1
safety 12
brass 11.7
gas 11.6
rack 11.1
statue 10.9
radiation 10.8
toxic 10.7
clothing 10.7
chemical 10.6
support 10.5
covering 10.5
style 10.4
business 10.3
dark 10
rifle 9.9
sunset 9.9
radioactive 9.8
destruction 9.8
disaster 9.8
nuclear 9.7
urban 9.6
horn 9.2
building 8.9
businessman 8.8
protective covering 8.8
music 8.5
suit 8.4
outdoor 8.4
power 8.4
equipment 8.3
human 8.2
sport 8.2
industrial 8.2
dirty 8.1
harmonica 8
helmet 7.9
armed 7.9
protect 7.7
pollution 7.7
dangerous 7.6
technology 7.4
action 7.4
art 7.3
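
Scores in this style are usually fetched from Imagga's v2 /tags REST endpoint with HTTP basic auth. A sketch under that assumption; the credentials and image URL are placeholders:

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
    API_SECRET = "YOUR_IMAGGA_SECRET"

    # Imagga's v2 tagging endpoint takes an image URL and basic auth;
    # the URL below is a placeholder, not the museum's image.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/self_portrait.jpg"},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Tags arrive ranked by a 0-100 confidence, as in "man 31.6" above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')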

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

wall 97.3
text 97.2
piano 96.5
footwear 81.9
furniture 77
clothing 76.8
person 67
musical instrument 56.5
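
Tags like these can be obtained from Azure's Computer Vision service. A sketch assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the exact service version that produced these numbers is unknown:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                           # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Tag an image by URL; each tag has a name and a 0-1 confidence,
    # so scaling by 100 matches rows such as "wall 97.3" above.
    result = client.tag_image("https://example.org/self_portrait.jpg")  # placeholder URL
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")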

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 86.3%
Angry 11.3%
Confused 0.9%
Surprised 0.4%
Sad 0.4%
Fear 0.3%
Happy 0.3%
Disgusted 0.2%
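
The age range, gender, and emotion rows match the FaceDetails structure returned by Rekognition's DetectFaces API when all attributes are requested. A minimal sketch, again assuming boto3 and a hypothetical local image file:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    # Hypothetical local copy of the photograph.
    with open("self_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions
    # in addition to the default bounding box and landmarks.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')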

Microsoft Cognitive Services

Age 41
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
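
Google Vision reports face attributes as categorical Likelihood values rather than percentages, which is why these rows read "Very unlikely". A sketch assuming the google-cloud-vision Python client with application-default credentials and a hypothetical local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application-default credentials

    # Hypothetical local copy of the photograph.
    with open("self_portrait.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods are enums (VERY_UNLIKELY .. VERY_LIKELY), not scores,
    # matching the "Very unlikely" rows above.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)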

Feature analysis

Amazon

Person 97.6%
Shoe 87.7%
