Human Generated Data

Title

Penn Station

Date

c. 1956, printed later

People

Artist: Louis Stettner, American, 1922 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.565

Copyright

© Louis Stettner Estate

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Handrail 99.7
Banister 99.7
Person 98.8
Human 98.8
Apparel 95.8
Clothing 95.8
Person 94.4
Person 92.9
Railing 87.4
Pedestrian 82.7
Helmet 78
Hat 72.8
Finger 67
Sleeve 55.6

Clarifai
created on 2018-02-10

people 99.7
monochrome 98.7
street 98.6
adult 97.9
woman 96.4
man 96.2
one 95.9
two 95
group together 93.6
administration 93.3
portrait 92.2
police 91.4
group 90.9
city 89.3
wear 88.6
recreation 87.5
three 87.2
offense 85.6
veil 83
several 82.6

Imagga
created on 2018-02-10

man 36.3
people 25.7
turnstile 25
male 23.4
equipment 22
gate 21.6
device 19.6
adult 19.4
training 18.5
person 18.4
men 16.3
hat 15.8
sport 15.5
two 15.2
city 14.1
strength 14
helmet 13.6
safety 12.9
building 12.7
worker 12.7
occupation 11.9
power 11.7
barrier 11.7
business 11.5
job 11.5
urban 11.4
work 11
exercise 10.9
lifestyle 10.8
black 10.8
transportation 10.8
industrial 10
fitness 9.9
modern 9.8
gun 9.6
weight 9.4
clothing 9.3
protection 9.1
body 8.8
weapon 8.7
exercising 8.7
gym 8.6
model 8.6
professional 8.5
hand 8.4
health 8.3
human 8.2
machine 8.1
uniform 7.9
face 7.8
portrait 7.8
jacket 7.7
attractive 7.7
industry 7.7
fashion 7.5
street 7.4
team 7.2
women 7.1

Google
created on 2018-02-10

white 96.3
photograph 96.1
black 95.9
man 95.3
standing 94.6
black and white 93.6
person 92.4
sitting 89.9
monochrome photography 88.7
photography 84.7
car 83.2
male 82.1
snapshot 81.8
monochrome 74.5
human 70.2
darkness 61.8
gentleman 61.8
human behavior 60.3
film noir 58
angle 54.9

Microsoft
created on 2018-02-10

person 98.9
outdoor 87.4

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 70.9%
Happy 0.2%
Disgusted 0.2%
Surprised 0.3%
Confused 0.1%
Calm 98.4%
Sad 0.4%
Angry 0.3%

Feature analysis

Amazon

Person 98.8%
Helmet 78%
Hat 72.8%

Captions

Microsoft

a person sitting on a bench 62.7%
a person standing on top of a bench 62.1%
a person that is sitting on a bench 56.4%