America Is Fascist
It happened here. The United States is officially a fascist nation. Leftists in the 1960s and 1970s loved to throw around the F word: Johnson was fascist, Nixon was fascist, Amerikkka was fascist, the cops were fascist. As a history student and the son of a woman who lost her childhood to the objectively fascist collaborationist government of Nazi-occupied France, I could not ignore the chasm between the systemic racism and authoritarian tendencies of the police in this country and the goose-stepping militaristic mania of Fascist Italy and Hitler’s Germany. As I passed into adulthood and middle age, however, the United States kept moving Right—under presidencies of both political parties. As power concentrated ever more tightly in the executive branch, as the Pentagon launched ever more wars of choice with fewer attempts at justification, and as the most extreme dystopian nightmares morphed into banal reality, liberalism and leftism became distant memories. Right-wing extremism became normalized in culture, then in the press,…