Human Generated Data

Title

Untitled (New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3144

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Each tag below is listed with the service's confidence score on a 0-100 scale.

Amazon
created on 2023-10-06

Clothing 100
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Overcoat 99
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
People 98.3
Person 96.1
Person 93.4
Adult 92.9
Male 92.9
Man 92.9
Person 92.9
Person 92.6
Person 91.6
Person 87.3
Person 86.6
Coat 84.4
Coat 77.3
Person 74.6
Person 72.5
Face 70.7
Head 70.7
Officer 67.5
Hat 64.3
Crowd 57.4
Walking 57.2
Funeral 56.4
Pedestrian 55.9
Formal Wear 55.6
Suit 55.6
Cap 55.1
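
The labels above are typical output of an image-labeling API. Below is a minimal sketch of how such tags could be generated with AWS Rekognition's DetectLabels call; the local file name and the confidence threshold are assumptions, not part of the museum record.

import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("shahn_untitled_new_york_city.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # roughly the lowest confidence listed above
)

# Print each label with its confidence score, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")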

Clarifai
created on 2018-05-10

people 99.9
many 99.4
group 98.9
group together 98.4
adult 97.6
administration 97
wear 93.7
leader 93.6
police 93.4
military 93
offense 92.6
woman 92.6
man 92
law 91
war 88.9
uniform 87.6
crowd 87.4
outfit 85
chair 84.6
funeral 84.4

Imagga
created on 2023-10-06

man 30.9
people 28.4
photographer 27.5
person 24.4
male 22
silhouette 21.5
world 20.7
brass 17.8
adult 17.7
men 17.2
mask 16.9
wind instrument 15.7
group 15.3
city 15
women 14.2
clothing 14
black 13.9
dark 13.3
street 12.9
business 12.7
uniform 12.1
protection 11.8
danger 11.8
sunset 11.7
human 11.2
sax 11.2
music 11.2
musical instrument 11
crowd 10.6
urban 10.5
dirty 9.9
trombone 9.8
destruction 9.8
walking 9.5
smoke 9.3
life 9.2
safety 9.2
industrial 9.1
soldier 8.8
protective 8.8
military 8.7
standing 8.7
light 8.7
gas 8.7
work 8.6
musician 8.6
device 8.5
portrait 8.4
window 8.2
pose 8.1
covering 8.1
night 8
businessman 7.9
radioactive 7.8
radiation 7.8
toxic 7.8
chemical 7.7
outdoor 7.6
sound 7.6
beach 7.6
stand 7.6
fashion 7.5
style 7.4
megaphone 7.4
helmet 7.4
occupation 7.3
lifestyle 7.2
building 7.2
gun 7.2
team 7.2
performer 7.2
military uniform 7.1
job 7.1
travel 7
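
The Imagga tags above follow the same pattern. A minimal sketch, assuming Imagga's v2 /tags REST endpoint and using placeholder credentials and image URL, that would produce a comparable tag list:

import requests

# Placeholder credentials and image URL; substitute real values.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/shahn_untitled_new_york_city.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries an English tag name and a 0-100 confidence score.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")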

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.9
outdoor 98.8
standing 92.1
group 91
people 90.2
line 17.2

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 100%
Sad 84.7%
Calm 60.4%
Surprised 6.3%
Fear 5.9%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Calm 69.5%
Sad 37.6%
Surprised 6.5%
Fear 6%
Confused 4.1%
Angry 0.9%
Disgusted 0.9%
Happy 0.6%

AWS Rekognition

Age 29-39
Gender Male, 99.6%
Calm 77.2%
Angry 10.2%
Surprised 7.2%
Fear 6.2%
Confused 3.9%
Sad 3.3%
Disgusted 1.6%
Happy 1.5%

AWS Rekognition

Age 20-28
Gender Male, 71.3%
Calm 64.9%
Sad 51.8%
Surprised 6.6%
Fear 6.2%
Angry 4.1%
Confused 1%
Disgusted 0.8%
Happy 0.5%

AWS Rekognition

Age 16-22
Gender Female, 91.4%
Sad 100%
Calm 7.3%
Surprised 6.4%
Fear 6.3%
Angry 1.1%
Disgusted 0.9%
Confused 0.5%
Happy 0.2%

AWS Rekognition

Age 20-28
Gender Male, 80.1%
Sad 97.3%
Calm 21.7%
Fear 10.8%
Surprised 8.6%
Angry 6.8%
Disgusted 2.5%
Confused 2.3%
Happy 1.1%
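
The age, gender, and emotion estimates above match the shape of AWS Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, with the local file name as an assumption:

import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

# Hypothetical local copy of the photograph.
with open("shahn_untitled_new_york_city.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

# Print one block per detected face, in the same layout as the record above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
    print()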

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Coat 84.4%
Hat 64.3%

Text analysis

Amazon

DEAD
ST
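
Text detections such as "DEAD" and "ST" correspond to what an OCR-style API returns for signage in the photograph. A minimal sketch using AWS Rekognition's DetectText call, with the file name again as an assumption:

import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("shahn_untitled_new_york_city.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Report individual words only; LINE entries group words together.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])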