Human Generated Data

Title

Untitled (crowds of people in Penn Station)

Date

1951

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7620

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 99.4
Person 99.3
Person 99.3
Building 98.2
Person 98
Urban 97.7
City 97.7
Town 97.7
Downtown 97.7
Person 96.9
Person 96.1
Person 94.9
Architecture 93.7
Person 92.2
Person 91
Pedestrian 87
Person 83.4
Person 82.1
Person 81
Crowd 80.1
Person 79.6
Interior Design 76.7
Indoors 76.7
Stage 75.8
Leisure Activities 75.3
Person 73.1
Dance Pose 72.9
Poster 67.8
Advertisement 67.8
Person 66.6
Boardwalk 64.9
Bridge 64.9
People 60.7
Person 46.4
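
The page does not say how these tags were produced beyond the service name and date, but lists of this shape (label plus confidence in percent) match AWS Rekognition label detection. A minimal sketch in Python, assuming configured AWS credentials and a placeholder filename:

```python
import boto3

# A sketch only: region, filename, and thresholds are assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=40,  # the list above goes down to ~46%
    )

for label in response["Labels"]:
    # Each label carries a name and a confidence in percent,
    # matching entries like "Person 99.6" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```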

Clarifai
created on 2023-10-25

people 99.8
many 98.5
group 96.2
man 96.2
crowd 96.2
adult 92.3
wear 91.2
woman 89.1
group together 88.2
audience 81.8
leader 81.3
religion 79.6
child 79.5
ceremony 78.6
administration 78.5
indoors 77.6
furniture 74.6
spectator 74.1
art 73.6
monochrome 72.7
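
The page does not name the Clarifai model; tags in this style typically come from its general image recognition model, whose concepts carry values in 0-1. A minimal sketch over the REST API, with a placeholder key and image URL (the exact endpoint path varies by account and API version):

```python
import requests

# A sketch only: the API key, model path, and image URL are placeholders.
API_KEY = "YOUR_CLARIFAI_API_KEY"
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
resp = requests.post(url, headers={"Authorization": f"Key {API_KEY}"}, json=payload)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in 0-1; scale to match "people 99.8" above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```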

Imagga
created on 2022-01-08

stage 76
platform 61.7
city 33.2
architecture 28.9
travel 26
hall 23.4
building 22.3
urban 19.2
sky 19.1
night 16
tourism 15.7
landmark 15.3
landscape 14.9
structure 14.7
water 14
crowd 13.4
old 13.2
famous 13
panorama 12.4
cityscape 12.3
freight car 12.2
sea 11.7
people 11.7
ocean 11.6
downtown 11.5
car 11.2
flag 11.1
street 11
light 10.7
vacation 10.6
skyline 10.4
bridge 10.4
historic 10.1
tourist 10
athletic facility 9.7
construction 9.4
culture 9.4
vehicle 9.1
center 9.1
history 8.9
group 8.9
facility 8.7
wheeled vehicle 8.7
scene 8.6
business 8.5
historical 8.5
gymnasium 8.4
town 8.3
lights 8.3
lake 8.2
park 8.2
transport 8.2
national 8.1
palace 8.1
coast 8.1
transportation 8.1
holiday 7.9
ancient 7.8
houses 7.7
house 7.7
buildings 7.6
journey 7.5
destination 7.5
symbol 7.4
square 7.2
road 7.2
horizon 7.2
mountain 7.1
summer 7.1
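
Imagga tags like these, with confidences already in percent, correspond to its /v2/tags endpoint. A minimal sketch, assuming a placeholder key/secret pair and image URL:

```python
import requests

# A sketch only: credentials and the image URL are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP Basic auth
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Confidence is already a percentage, e.g. "stage 76".
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```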

Google
created on 2022-01-08

Font 81.4
Adaptation 79.3
Event 74
Monochrome photography 72.5
Art 69.5
Rectangle 69.2
Monochrome 68.1
Room 67.5
Paper product 66.6
History 66.3
Crowd 65.4
Visual arts 65.1
Holy places 59.7
Pole 54.9
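
Labels in this style correspond to the Cloud Vision API's label detection feature, which scores in 0-1. A minimal sketch with the google-cloud-vision client, assuming application-default credentials and a placeholder file path:

```python
from google.cloud import vision

# A sketch only: the filename is a placeholder and credentials
# are assumed to be configured in the environment.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for annotation in response.label_annotations:
    # Scores are 0-1; scale to match entries like "Font 81.4" above.
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```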

Microsoft
created on 2022-01-08

text 99.6
person 97.4
dance 93.8
clothing 77.2
people 60
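
Tags of this shape match the Azure Computer Vision tagging operation, which also scores in 0-1. A minimal sketch with the Azure SDK, assuming placeholder endpoint and key values:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# A sketch only: the endpoint, key, and image URL are placeholders.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # Confidence is 0-1; scale to match entries like "text 99.6" above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```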

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Calm 97.7%
Sad 2.1%
Angry 0%
Happy 0%
Confused 0%
Fear 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 22-30
Gender Male, 98.7%
Calm 20.6%
Sad 18.4%
Angry 18.1%
Confused 16%
Happy 15.7%
Fear 5.1%
Surprised 3.6%
Disgusted 2.5%

AWS Rekognition

Age 25-35
Gender Male, 98.3%
Calm 39.2%
Happy 31.6%
Disgusted 16.3%
Angry 4%
Sad 2.9%
Surprised 2.6%
Fear 1.9%
Confused 1.5%

AWS Rekognition

Age 21-29
Gender Male, 87.7%
Confused 74.8%
Calm 14.2%
Surprised 2.9%
Happy 2.2%
Sad 2.1%
Disgusted 1.7%
Angry 1.3%
Fear 0.8%

AWS Rekognition

Age 18-26
Gender Female, 51.6%
Calm 87.6%
Happy 5.8%
Sad 4.4%
Angry 0.7%
Fear 0.5%
Disgusted 0.4%
Surprised 0.4%
Confused 0.1%

AWS Rekognition

Age 16-22
Gender Male, 97.7%
Angry 85.4%
Calm 9%
Disgusted 1.7%
Sad 1.5%
Confused 1%
Happy 0.6%
Fear 0.6%
Surprised 0.2%
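
Each block above (an age range, a gender call, and a full emotion breakdown per detected face) matches AWS Rekognition face detection with all facial attributes requested. A minimal sketch, assuming configured AWS credentials and a placeholder filename:

```python
import boto3

# A sketch only: region and filename are assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back for every face; sort to match the
    # highest-confidence-first listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```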

Feature analysis

Amazon

Person 99.6%
Poster 67.8%
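
These appear to be the strongest object labels from the same Rekognition label detection call sketched above; that call can also return per-instance bounding boxes for labels such as Person.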

Text analysis

Amazon

OEEW
69 OEEW
69
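
Strings like "OEEW" and "69" are the kind of output AWS Rekognition text detection returns for signage in a photograph, reported both as whole lines and as individual words (hence the repeats). A minimal sketch, assuming configured AWS credentials and a placeholder filename:

```python
import boto3

# A sketch only: region and filename are assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # LINE entries (e.g. "69 OEEW") aggregate WORD entries ("69", "OEEW").
    print(detection["Type"], detection["DetectedText"])
```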