Human Generated Data

Title

Untitled (Civil Works Administration demonstration, New York City)

Date

December 1933 – March 1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4243

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 98.8
Person 98.4
Person 98.2
Person 97.8
Person 97.4
Person 96.5
Person 95.8
Person 93.7
People 93.5
Person 92.6
Crowd 91.9
Person 91.3
Military Uniform 90.4
Military 90.4
Person 90.1
Parade 89.6
Person 89.4
Person 84
Person 82.1
Army 81
Armored 81
Person 78.7
Person 73.1
Person 72.1
Soldier 72
Officer 71.5
Person 66.1
Troop 64.6
Person 64.1
Person 63.2
Protest 58.5
Marching 56.4
Person 49.8
Person 47.8
Person 45.3
Person 45.1
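
The label tags above (e.g. "Person 98.4", "Crowd 91.9") follow the output format of object-detection APIs such as AWS Rekognition's DetectLabels, which returns label names paired with confidence scores. A minimal sketch of how comparable tags could be generated, assuming the photograph is available locally as a hypothetical file shahn_cwa_demo.jpg and AWS credentials are configured:

    import boto3

    # Hypothetical local copy of the photograph; not part of the museum record.
    with open("shahn_cwa_demo.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # DetectLabels returns label names with confidence scores (0-100),
    # matching the "label confidence" pairs listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=45,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")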

Clarifai
created on 2023-10-25

people 99.9
street 98.8
group 97.6
many 96.9
group together 96.3
adult 96.2
war 94.4
woman 94.2
man 92.8
child 92.5
wear 91.9
soldier 89.3
portrait 88.3
boy 88.3
administration 86.9
train 86.6
military 86.5
railway 86.4
art 82.3
monochrome 82.1

Imagga
created on 2022-01-08

rifle 36.5
military uniform 33.8
gun 29.1
clothing 28.5
uniform 27.9
firearm 27.3
city 19.1
man 18.1
weapon 17.5
people 17.3
black 14.7
covering 14.6
protection 14.5
dirty 14.4
consumer goods 13.6
person 13.2
urban 13.1
building 12.8
adult 12.3
old 11.8
danger 11.8
industrial 11.8
destruction 11.7
accident 11.7
grunge 11.1
mask 10.8
male 10.6
military 10.6
gas 10.6
window 10.3
dark 10
soldier 9.8
disaster 9.8
protective 9.7
nuclear 9.7
men 9.4
musical instrument 9.3
wind instrument 9.2
silhouette 9.1
transportation 9
stalker 8.9
radioactive 8.8
camouflage 8.8
radiation 8.8
toxic 8.8
life 8.7
chemical 8.7
scene 8.6
architecture 8.6
business 8.5
art 8.5
human 8.2
shadow 8.1
water 8
device 7.8
travel 7.7
war 7.7
texture 7.6
typesetting machine 7.6
fun 7.5
smoke 7.4
world 7.3
portrait 7.1
women 7.1

Google
created on 2022-01-08

Black 89.6
Rectangle 84.6
Black-and-white 84.4
Style 83.8
Window 81.4
Art 80.8
Font 80.7
Tints and shades 76.6
Building 74.5
Monochrome photography 73.4
Blazer 73.4
Monochrome 71.3
Event 69.5
Suit 67.9
Room 66.7
Road 65.8
Street 63.3
City 63.3
Visual arts 62.6
Street fashion 62.4

Microsoft
created on 2022-01-08

text 97.5
person 95.9
clothing 95.4
man 84
group 62.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 100%
Calm 99.7%
Sad 0.1%
Fear 0.1%
Surprised 0%
Confused 0%
Happy 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 21-29
Gender Male, 89.3%
Calm 91.1%
Sad 5.6%
Surprised 1.3%
Fear 0.9%
Happy 0.5%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 94.4%
Sad 1.6%
Angry 1.3%
Fear 1.1%
Surprised 0.6%
Disgusted 0.4%
Confused 0.4%
Happy 0.3%

AWS Rekognition

Age 35-43
Gender Male, 53.1%
Angry 80.2%
Calm 13.5%
Fear 1.5%
Sad 1.4%
Surprised 1.3%
Happy 0.9%
Disgusted 0.7%
Confused 0.5%

AWS Rekognition

Age 24-34
Gender Male, 95.7%
Calm 63.6%
Angry 18.5%
Happy 12.4%
Disgusted 2%
Surprised 2%
Confused 0.8%
Sad 0.4%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Male, 98.7%
Sad 98%
Fear 1.1%
Calm 0.5%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Happy 0%
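
Each face block above (age range, gender, and emotion percentages) corresponds to one entry in the FaceDetails list returned by Rekognition's DetectFaces when facial attributes are requested. A minimal sketch, assuming the same hypothetical local image file as before:

    import boto3

    with open("shahn_cwa_demo.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # Attributes=["ALL"] adds age range, gender, and emotion estimates
    # to each detected face, as listed in the blocks above.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")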

Feature analysis

Amazon

Person 98.4%

Categories

Text analysis

Amazon

CWA
CWA JOBS
JOBS
WANT
OPTIMO
OF
WE WANT
UNION
WE
UNION KM K
KM
TEDRO
ALAMIN
K
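
The detected strings above (picket-sign text such as "CWA JOBS" and "WE WANT") match the word- and line-level output of Rekognition's DetectText. A minimal sketch, again assuming the hypothetical local file shahn_cwa_demo.jpg:

    import boto3

    with open("shahn_cwa_demo.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")

    # DetectText returns both LINE and WORD detections; the list above
    # appears to mix the two, so every detection is printed here.
    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        print(detection["DetectedText"])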