Human Generated Data

Title

Untitled (crowds of people in Penn Station)

Date

1951

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7618

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.9
Human 98.9
Person 98.6
Person 98.5
Person 98.3
Person 96.6
Person 95.8
Person 95.7
Person 95.1
Person 94.7
Person 88.9
Person 87.4
Crowd 82.8
People 80.4
Audience 76
Clothing 75.6
Apparel 75.6
Indoors 72.4
Room 72.2
School 71.7
Stage 70.1
Person 69
Person 67.8
Person 67.5
Classroom 65
Person 64.5
Overcoat 61.8
Suit 61.8
Coat 61.8
Person 59.6
Photography 59.1
Photo 59.1
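
Labels like these are the type of output produced by the AWS Rekognition DetectLabels API. A minimal sketch of the call via boto3, assuming configured AWS credentials; "penn_station.jpg" is a placeholder filename, not the actual museum image:

```python
import boto3

# A minimal sketch: label detection with AWS Rekognition via boto3.
client = boto3.client("rekognition")

with open("penn_station.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50.0,
)

# Each label carries a name and a 0-100 confidence score, matching the
# "Person 98.9", "Crowd 82.8" style of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```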

Clarifai
created on 2023-10-25

people 99.9
many 99.2
group 99.1
woman 98.5
adult 98
man 96.5
group together 94.1
wear 93.4
leader 92.8
ceremony 92.4
child 91.4
crowd 90.9
administration 85
gown (clothing) 84.2
wedding 82.4
religion 79.7
veil 79.4
dress 79.2
several 78.9
clergy 76.4
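
Concept tags like those above are served by Clarifai's predict endpoint. A hedged sketch against the public v2 REST API; the model name, auth header, and response shape are assumptions based on Clarifai's documentation, and the key and image URL are placeholders:

```python
import requests

# A hedged sketch of Clarifai's v2 predict REST endpoint; model name,
# header format, and response shape are assumptions from public docs.
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
headers = {"Authorization": "Key YOUR_CLARIFAI_KEY"}  # placeholder key
payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/penn_station.jpg"}}}
    ]
}

resp = requests.post(url, headers=headers, json=payload)
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value; scaling by 100 gives
# scores in the "people 99.9" style shown above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```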

Imagga
created on 2022-01-08

people 27.3
metropolitan 22.2
building 18.7
city 17.4
hall 17.3
clothing 17.2
vestment 15.8
architecture 15.6
room 15.4
old 15.3
business 15.2
man 14.8
travel 14.1
shop 13.6
interior 13.3
gown 13.2
window 13.2
adult 12.8
women 12.6
spectator 12.3
tourism 11.5
men 11.2
student 11
tourist 10.9
history 10.7
indoors 10.5
lamp 10.5
group 10.5
person 10.4
life 10.4
inside 10.1
street 10.1
historic 10.1
outerwear 10.1
male 9.9
modern 9.8
urban 9.6
crowd 9.6
classroom 9.6
couple 9.6
office 9.4
monument 9.3
house 9.2
religion 9
catholic 8.8
home 8.8
day 8.6
wall 8.5
church 8.3
fashion 8.3
indoor 8.2
boutique 8
love 7.9
work 7.8
art 7.8
table 7.8
professional 7.8
counter 7.7
chair 7.6
historical 7.5
buy 7.5
shopping 7.3
enrollee 7.3
family 7.1
uniform 7.1
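
Imagga's tags come from its /v2/tags endpoint. A hedged sketch, assuming HTTP Basic auth with an API key/secret pair; the credentials and image URL are placeholders:

```python
import requests

# A hedged sketch of Imagga's /v2/tags endpoint; the response shape is
# an assumption from Imagga's public docs.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/penn_station.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale, matching "people 27.3" above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```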

Microsoft
created on 2022-01-08

text 99
person 90.2
clothing 84.2
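
The Microsoft tags match the shape of the Azure Computer Vision analyze operation. A hedged sketch against the v3.2 REST API; the resource endpoint, subscription key, and image URL are placeholders, and the response shape is an assumption from Microsoft's public docs:

```python
import requests

# A hedged sketch of the Azure Computer Vision "analyze" REST call.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.com/penn_station.jpg"},
)
resp.raise_for_status()

# Azure returns confidence on a 0-1 scale; scaled by 100 it matches the
# "text 99", "person 90.2" style above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```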

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 66%
Happy 52.8%
Calm 27%
Sad 16.3%
Angry 2.6%
Fear 0.3%
Surprised 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 50-58
Gender Female, 74.1%
Calm 97.6%
Sad 1.3%
Fear 0.6%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 11-19
Gender Female, 93.6%
Sad 28.5%
Disgusted 27.3%
Angry 14.7%
Calm 9.4%
Happy 7.7%
Surprised 5.9%
Fear 3.8%
Confused 2.6%

AWS Rekognition

Age 19-27
Gender Female, 85.1%
Calm 83.9%
Surprised 6.8%
Sad 2.9%
Angry 2.1%
Disgusted 1.2%
Fear 1.2%
Happy 1%
Confused 1%

AWS Rekognition

Age 16-24
Gender Male, 68.4%
Calm 84.1%
Sad 7.4%
Confused 2.6%
Angry 2.1%
Disgusted 1.2%
Surprised 1.1%
Fear 0.8%
Happy 0.7%
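
Per-face blocks like the five above are the output shape of Rekognition's DetectFaces API. A minimal sketch via boto3, assuming configured AWS credentials and a placeholder filename:

```python
import boto3

# A minimal sketch: face analysis with AWS Rekognition's DetectFaces.
client = boto3.client("rekognition")

with open("penn_station.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

# Each face detail mirrors one "AWS Rekognition" block above: an age
# range, a gender guess with confidence, and per-emotion confidences.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```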

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
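
The rows above are Google Cloud Vision's face-detection output, which reports bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch with the Python client library, assuming configured Google Cloud credentials and a placeholder filename:

```python
from google.cloud import vision

# A minimal sketch: face detection with the Google Cloud Vision client.
client = vision.ImageAnnotatorClient()

with open("penn_station.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each annotation carries likelihood buckets matching the
# "Very unlikely" rows above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```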

Feature analysis

Amazon

Person 98.9%

Text analysis

Amazon

TRAINS
STAIRS
FOR
DOWN STAIRS FOR
INCOMING
M330
DOWN
M330 71.
71.
LOW
SEVENTIA
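
Tokens like these are produced by Rekognition's DetectText API. A minimal sketch via boto3, assuming configured AWS credentials and a placeholder filename:

```python
import boto3

# A minimal sketch: text detection with AWS Rekognition's DetectText.
client = boto3.client("rekognition")

with open("penn_station.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# Detections come back as LINE and WORD entries; signage OCR on a 1951
# photograph is noisy, which is why fragments like "SEVENTIA" appear.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```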