Human Generated Data

Title

Untitled (demonstrators, New York City)

Date

1934-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2992

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 99.8
Coat 99.8
Walking 99.5
People 99.2
Male 99.1
Man 99.1
Person 99.1
Adult 99.1
Male 98.6
Man 98.6
Person 98.6
Adult 98.6
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Adult 97.6
Male 97.6
Man 97.6
Person 97.6
Person 97.4
Adult 96.7
Male 96.7
Man 96.7
Person 96.7
Adult 96.6
Male 96.6
Man 96.6
Person 96.6
Person 96.2
Person 93.3
Adult 91.8
Male 91.8
Man 91.8
Person 91.8
Footwear 88.3
Shoe 88.3
Person 83
Person 78.6
Overcoat 76.6
Shoe 75.2
Shoe 73.9
Hat 71.8
Shoe 71.3
Pedestrian 68
Face 67.4
Head 67.4
Shoe 61.9
Hat 58.4
Shoe 57.9
Crowd 56.7
Accessories 55.5
Bag 55.5
Handbag 55.5
Shoe 55.1

Clarifai
created on 2018-05-10

people 99.8
many 98.7
group together 98.5
group 97
adult 95.9
administration 95.8
wear 94.7
man 94
street 92.5
military 92.4
crowd 89.8
woman 89.8
war 88.1
child 87.5
vehicle 85.5
uniform 84.8
recreation 84.7
police 83.7
outfit 82.1
monochrome 81

Imagga
created on 2023-10-05

people 35.1
city 33.3
street 31.3
man 29
crowd 26.9
urban 26.2
person 24.5
pedestrian 23.5
walking 22.7
motion 22.3
business 21.9
adult 21.8
walk 21
men 20.6
male 19.1
rush 17.7
blurred 16.3
blur 15.8
sidewalk 15.7
women 15
clothing 14.9
life 14.4
bag 13.7
group 13.7
businessman 13.2
human 12.8
travel 12.7
move 12.5
crutch 12.3
legs 12.3
rushing 11.9
stick 11.5
suit 11.5
world 11.5
staff 11.5
shopping 11
station 10.8
briefcase 10.6
fashion 10.6
black 10.5
scene 9.5
corporate 9.5
uniform 9.4
day 9.4
clothes 9.4
window 9.2
leg 9.1
passenger 8.9
airport 8.8
shoes 8.6
work 8.6
active 8.5
shop 8.4
hand 8.4
safety 8.3
hall 8
building 8
commuter 7.9
leaving 7.9
subway 7.9
hour 7.8
feet 7.7
weapon 7.7
jeans 7.6
casual 7.6
movement 7.5
leisure 7.5
silhouette 7.5
tourist 7.3
occupation 7.3
speed 7.3
girls 7.3
protection 7.3
looking 7.2
worker 7.1
job 7.1
architecture 7
indoors 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.8
outdoor 92
group 85.3
people 73.8

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Surprised 97.9%
Calm 10.7%
Fear 9%
Sad 2.6%
Angry 2.3%
Confused 1.7%
Disgusted 0.8%
Happy 0.3%

AWS Rekognition

Age 23-31
Gender Male, 86%
Happy 82.8%
Surprised 7.3%
Fear 6.7%
Calm 6.1%
Sad 3%
Confused 2.6%
Disgusted 1.1%
Angry 1.1%

Microsoft Cognitive Services

Age 6
Gender Female

Feature analysis

Amazon

Male 99.1%
Man 99.1%
Person 99.1%
Adult 99.1%
Shoe 88.3%
Hat 71.8%

Text analysis

Amazon

ARTIST
artists
I
ST
A
LAYE
BED
..
pees
need
AFFERIO

Google

トER ertist
ER
ertist