Human Generated Data

Title

Untitled (crowds of people in Penn Station)

Date

1951

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7619

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Downtown 99.9%
Building 99.9%
City 99.9%
Urban 99.9%
Town 99.9%
Person 99.4%
Human 99.4%
Person 99.1%
Architecture 99.1%
Person 98.9%
Person 98.6%
Person 98.1%
Person 98%
Person 97.3%
Person 95.7%
Person 95.3%
Person 93.1%
Person 92.2%
Poster 86.3%
Advertisement 86.3%
Pedestrian 85.4%
Person 84.9%
Crowd 71.3%
Leisure Activities 69.5%
Town Square 69.5%
Plaza 69.5%
Metropolis 69.1%
Person 68.9%
Person 62.1%
Dance Pose 59%
Office Building 56.8%
Arch 55.2%
Arched 55.2%
Arena 55.1%
Person 45.5%
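
The label list above is typical of Amazon Rekognition's object and scene detection output. A minimal sketch of such a call with boto3 follows; the image path and MinConfidence threshold are assumptions, not details from the museum's pipeline.

```python
# A sketch of reproducing the label list with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "penn_station.jpg" is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("penn_station.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=45,  # assumed threshold; the lowest tag above is 45.5%
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}%')
    # The repeated "Person" rows above most likely correspond to per-instance
    # detections, which Rekognition returns under label["Instances"].
    for instance in label.get("Instances", []):
        print(f'  {label["Name"]} (instance) {instance["Confidence"]:.1f}%')
```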

Imagga
created on 2022-01-08

stage 87.3%
platform 69.8%
city 43.2%
architecture 41.5%
building 34.5%
urban 27.9%
travel 26.7%
hall 26.2%
landmark 26.2%
cityscape 21.8%
tourism 20.6%
night 20.4%
skyline 19.9%
water 19.3%
sky 17.8%
structure 17.3%
famous 16.7%
tourist 16.4%
downtown 16.3%
bridge 16.3%
town 15.8%
center 15.6%
old 15.3%
river 15.1%
ocean 15.1%
sea 14.8%
panorama 14.3%
boat 13.9%
historic 13.7%
tower 13.4%
light 12.7%
modern 12.6%
history 12.5%
port 12.5%
vacation 12.3%
lights 12%
ship 11%
coast 10.8%
landscape 10.4%
buildings 10.4%
bay 10.4%
station 10.4%
business 10.3%
construction 10.3%
street 10.1%
panoramic 9.6%
scene 9.5%
historical 9.4%
destination 9.4%
palace 9%
summer 9%
people 8.9%
athletic facility 8.8%
boats 8.7%
skyscraper 8.6%
culture 8.5%
office 8.5%
facility 8.3%
transport 8.2%
waterfront 8.2%
day 7.8%
pacific 7.7%
house 7.7%
crowd 7.7%
vintage 7.4%
church 7.4%
new 7.3%
holiday 7.2%
theater 7.1%
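
Imagga exposes its tagger as a REST endpoint rather than an SDK. A minimal sketch against its v2 tags endpoint, with placeholder credentials and image path:

```python
# A sketch of the Imagga v2 tagging endpoint; the key, secret, and
# image path are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

with open("penn_station.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )

# Tags arrive ordered by confidence, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```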

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.7%
dance 88.3%
person 80.4%

Face analysis

Amazon

AWS Rekognition

Age 20-28
Gender Male, 96.7%
Calm 90.4%
Sad 3.4%
Angry 3.2%
Fear 1.3%
Happy 0.6%
Disgusted 0.6%
Confused 0.3%
Surprised 0.2%

AWS Rekognition

Age 24-34
Gender Male, 54%
Fear 56.3%
Happy 18.2%
Sad 9.9%
Calm 8.9%
Angry 3.9%
Confused 1.5%
Surprised 0.7%
Disgusted 0.7%

AWS Rekognition

Age 21-29
Gender Male, 97%
Calm 58.7%
Sad 11.3%
Happy 8.9%
Surprised 7.6%
Angry 5.5%
Fear 4%
Confused 2.3%
Disgusted 1.7%

AWS Rekognition

Age 18-26
Gender Male, 96%
Calm 86.4%
Angry 7.5%
Sad 2.3%
Fear 1.4%
Happy 1%
Confused 0.7%
Surprised 0.3%
Disgusted 0.3%

AWS Rekognition

Age 12-20
Gender Female, 93.6%
Calm 61.2%
Sad 23.3%
Fear 3.7%
Confused 3.5%
Disgusted 2.5%
Happy 2.3%
Angry 2.1%
Surprised 1.4%
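
Each block above corresponds to one face detected by Rekognition with the full attribute set. A minimal boto3 sketch of such a request, with a placeholder image path:

```python
# A sketch of per-face age, gender, and emotion estimation with
# Amazon Rekognition; the image path is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("penn_station.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are not guaranteed sorted; order them to mirror the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```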

Feature analysis

Amazon

Person 99.4%
Poster 86.3%

Captions

Microsoft

a group of people standing in front of a building 88.5%
a group of people standing on top of a building 84%
a group of people standing outside of a building 83.9%
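
The caption candidates above, like the Microsoft tag list earlier, are the kind of output the Azure Computer Vision service returns. A minimal sketch with its Python SDK (azure-cognitiveservices-vision-computervision); the endpoint, key, and image path are placeholders, and the exact service version used here is unknown.

```python
# A sketch of caption generation with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("your_subscription_key"),   # placeholder
)

with open("penn_station.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```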

Text analysis

Amazon

7
M330
M330 70.
70.
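
Fragments like these are typical OCR output over signage in the photograph. A minimal boto3 sketch of Rekognition's text detection, with a placeholder image path:

```python
# A sketch of Rekognition OCR over the same image.
import boto3

rekognition = boto3.client("rekognition")

with open("penn_station.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries aggregate WORD entries, which is why fragments such as
    # "M330" and "70." appear both separately and combined above.
    print(detection["Type"], detection["DetectedText"])
```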

Google

M3 30 •7.
M3
30
•7.
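
For comparison, a minimal sketch of the analogous Google Cloud Vision call (google-cloud-vision), assuming application credentials are configured and using the same placeholder image path:

```python
# A sketch of OCR with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("penn_station.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full detected block; subsequent entries are
# the individual tokens, matching "M3 30 •7." followed by its fragments.
for annotation in response.text_annotations:
    print(annotation.description)
```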