Human Generated Data

Title

Penn Station

Date

c. 1956, printed later

People

Artist: Louis Stettner, American, 1922–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.566

Copyright

© Louis Stettner Estate

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Person 99.6
Human 99.6
Person 99.2
Person 99.1
Person 99
Crowd 89.5
Attorney 74.9
Press Conference 59
Graduation 57.8
Court 56
Room 56
Indoors 56
Audience 55.4

Clarifai
created on 2018-02-10

people 99.5
airport 98.4
adult 98.3
man 97.1
group 96.8
woman 96.3
train 95.4
subway system 94.4
indoors 94.2
group together 93.6
monochrome 93.4
vehicle 90.7
wear 90.4
street 90.3
railway 89.8
room 89.2
transportation system 88.7
several 88.6
one 87.9
leader 87.6

Imagga
created on 2018-02-10

passenger 34.2
man 32.2
adult 31.1
car 30.7
sitting 28.3
people 24.5
male 24.1
vehicle 23.2
person 21.8
driver 20.4
happy 18.2
smiling 18.1
couple 17.4
business 16.4
automobile 15.3
groom 14.9
looking 14.4
portrait 14.2
auto 13.4
happiness 13.3
attractive 12.6
transportation 12.5
businessman 12.4
drive 12.3
black 12.2
driving 11.6
lifestyle 11.6
corporate 11.2
men 11.2
women 11.1
love 11
work 11
squeegee 11
smile 10.7
face 10.6
group 10.5
luxury 10.3
seat 10
holding 9.9
human 9.7
interior 9.7
indoors 9.7
professional 9.5
room 9.5
casual 9.3
communication 9.2
executive 9.2
inside 9.2
transport 9.1
suit 9.1
dress 9
fun 9
together 8.8
standing 8.7
wheel 8.5
office 8.4
pretty 8.4
color 8.3
phone 8.3
indoor 8.2
outdoors 8.2
cheerful 8.1
hair 7.9
travel 7.7
spectator 7.7
adults 7.6
friends 7.5
world 7.5
leisure 7.5
20s 7.3

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

indoor 95.2

Color Analysis

Face analysis

AWS Rekognition

Age 30-47
Gender Male, 98.1%
Angry 3.8%
Confused 9.2%
Disgusted 1.4%
Happy 4%
Surprised 5.3%
Sad 7.7%
Calm 68.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

HIN
HIN COien
COien