Human Generated Data

Title

The Blimp

Date

c. 1946

People

Artist: Irving Penn, American, 1917-2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Alexander Liberman, P1990.8

Copyright

© The Irving Penn Foundation

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Person 99.5
Human 99.5
Person 98.9
Art 90.5
Painting 79.1
Advertisement 77.8
Poster 74.5
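
The Amazon tags above are label-detection results, each a label name paired with a confidence score. A minimal sketch of how such labels can be requested through the AWS Rekognition detect_labels API with boto3 follows; the file name the-blimp.jpg and the thresholds are assumptions for illustration, not details of the pipeline behind the 2021-04-03 run.

    import boto3

    # Assumed local copy of the photograph; the original tagging pipeline is not documented here.
    with open("the-blimp.jpg", "rb") as f:
        image_bytes = f.read()

    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,        # assumed cap on returned labels
        MinConfidence=70.0,  # assumed confidence floor, consistent with the 70+ scores listed
    )
    for label in response["Labels"]:
        # Prints e.g. "Person 99.5", mirroring the tag/score pairs above
        print(label["Name"], round(label["Confidence"], 1))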

Clarifai
created on 2021-04-03

people 99.7
man 99.3
two 98.5
affection 98
adult 97.8
woman 97.5
family 97.3
painting 96.7
art 95.7
sepia 95.7
portrait 95.3
love 95.2
retro 94.4
wear 94.3
paper 93.2
sepia pigment 90.1
indoors 88.8
nostalgia 86.8
illustration 84.7
print 84.3

Imagga
created on 2021-04-03

person 24.4
adult 19.3
people 19
body 18.4
man 17.6
portrait 16.8
black 16.3
statue 15.8
model 15.6
male 15
face 14.2
exercise 13.6
human 13.5
sculpture 13.3
standing 13
lifestyle 13
fashion 12.8
hair 12.7
attractive 12.6
posing 12.4
smile 12.1
sexy 12.1
healthy 12
pretty 11.9
sport 11.9
fitness 11.8
art 11.2
love 11.1
world 10.5
health 10.4
action 10.2
happy 10
dress 9.9
active 9.9
lady 9.7
suit 9.5
clothing 9.5
figure 9.3
hand 9.3
pose 9.1
sky 8.9
garment 8.9
style 8.9
creation 8.8
happiness 8.6
bow tie 8.6
men 8.6
dancer 8.6
business 8.5
strength 8.4
summer 8.4
leisure 8.3
child 8.2
looking 8
businessman 7.9
bright 7.9
athlete 7.7
naked 7.7
youth 7.7
old 7.7
life 7.6
dad 7.6
energy 7.6
power 7.6
fun 7.5
outdoors 7.5
silhouette 7.5
makeup 7.3
teenager 7.3
sunset 7.2
activity 7.2
representation 7.1
boy 7.1
look 7

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

gallery 96.7
text 87.8
room 82.1
clothing 81.4
person 77.2
posing 73.5
man 50.8
old 40.3

Color Analysis

Face analysis

AWS Rekognition

Age 26-40
Gender Female, 63.6%
Surprised 69.9%
Confused 12.4%
Calm 7.2%
Happy 5.2%
Angry 2.5%
Fear 1.7%
Disgusted 0.7%
Sad 0.4%

AWS Rekognition

Age 12-22
Gender Female, 74.7%
Calm 74.9%
Happy 8%
Sad 6.2%
Surprised 5.4%
Fear 2%
Confused 1.8%
Angry 1.5%
Disgusted 0.3%

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
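
The face analysis blocks above report, for each detected face, an estimated age range, a gender guess with confidence, and a ranked list of emotion scores. A minimal sketch of retrieving the AWS Rekognition portion with detect_faces follows; it assumes boto3 and a local image file and only illustrates the shape of the response, not the exact configuration of the 2021-04-03 run.

    import boto3

    client = boto3.client("rekognition")
    with open("the-blimp.jpg", "rb") as f:  # assumed local file name
        image_bytes = f.read()

    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age = face["AgeRange"]            # {"Low": ..., "High": ...}
        gender = face["Gender"]           # {"Value": ..., "Confidence": ...}
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in emotions:
            # e.g. "Surprised 69.9%", as in the first AWS Rekognition block above
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")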

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

m
Pewm
HA
HA m Pewm Nwmplk OK
Nwmplk
OK
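
The text analysis entries are raw OCR detections: line and word fragments the model read off the print, including low-confidence strings such as "Pewm" and "Nwmplk". A minimal sketch of obtaining such detections with the AWS Rekognition detect_text API follows; the file name is an assumption.

    import boto3

    client = boto3.client("rekognition")
    with open("the-blimp.jpg", "rb") as f:  # assumed local file name
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        # "LINE" entries group words; "WORD" entries are single tokens,
        # which is why both "HA m Pewm Nwmplk OK" and its pieces appear above.
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))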