The term divided self comes from a book by R.D. Laing first published in 1960 and refers to the divergence between private, inner experience and the self one projects out into the world, the latter of which is reflected back to oneself as external feedback. Laing’s wider context includes psychological descriptions of psychic disintegration, dysphoria, schizophrenia, and madness associated with inauthenticity and adoption of a false self. The phrase “I’m beside myself” accidentally captures the sense of dual personae, i.e., having an internal monitor or being a passenger on board an out-of-control ride. Those obsessed with the idea of self (or identity, or mind, or psyche, or consciousness, take your pick, but not the identity politics version) at least know of Laing’s term, though I suspect it’s become moribund in today’s anti-intellectual environment. I resurrected it from memory in response to the Adam Curtis podcast interview embedded below.

Although I’m favorably disposed toward Adam Curtis, his remarks irritated me because he fails to cite Laing in his observations about the changing nature of discourse and how people hold themselves out in public — now always with an eye toward being surveilled, filmed, or both (filmed being an anachronism in the era of digital video). Yeah, well, duh. Curtis combs through historical footage and finds so much of it … dated, evidence of a time when people were more authentic and less performative. As a careful thinker about media, Curtis should recognize that historical representations in pictures, film, television, indeed all media, have always been stylized (just like incessant selfies with arm extended). If examples from decades past (his youth?) appear somehow more natural, less staged and managed, I consider that simple nostalgia bias.

Staged photos appeared in the earliest daguerreotypes, and silent films had to establish an entire vocabulary suitable to cinematic storytelling, which has continued to revise and refine itself over a century. TV news reports have similarly shifted from styles used by Edward R. Murrow to Walter Cronkite to Dan Rather to whatever idiots are now vomiting up government talking points. Musical recordings in all genres also exhibit a continuous evolution of performance practice, sometimes going backward to attempt to recapture historical practice as a fetish. Cinema sometimes does that, too (e.g., B&W and/or silent elements from the past purposely reused for effect). That’s the finding: consciousness (alternatively: the way the self is constructed) and its cultural expressions are always moving targets.


As industrial collapse gets worse and conditions deteriorate, the already unmanageable flow
of populations away from locations where life is intolerable or impossible will only increase.

Since publishing the earlier version of this title, the picture has shifted and reversed a couple times, i.e., already porous borders thrown wide open then slammed shut again. Thus, immigrants into the U.S. were initially limited and/or repelled, then welcomed without responsible oversight, and are now being arrested and deported without due process. The sentence at top is (part of) what I published in November 2018, knowing even then that mass movement and migration would be highly divisive and disruptive as industrial civilization enters its death throes. And so it has proved to be, though few will admit that civilization is cracking up. Civilian populations in Western countries have not yet openly targeted immigrants in their midst. That responsibility in the U.S. fell to ICE. Don’t know about other countries, but according to some, reports are suppressed regarding tensions building to expel and deport in the hope of preserving local cultures felt to be under siege from within.

Missing from this activity is plain acknowledgement of humanitarian duty or obligation to fellow humans, immigrants, who are really just refugees. Three interlocking conditions lead others to seek refuge from a failed state: (1) loss of economic opportunity, (2) inability of a country to govern itself responsibly, and (3) ecological collapse. As a doomer, I had thought regions becoming uninhabitable from pollution, despoliation, depletion, and warming climate (and sea level rise) would be the main driver. Turns out the other drivers precede loss of habitat. Still a long, long way down before industrial civilization terminates.

In the last decade or so, however, waves of immigration have stressed Western countries through a mixture of crime, fraud, and refusal to integrate. In contrast, pretty much every major American city has for generations had an unproblematic Chinatown district, so xenophobia all by itself is not the core of the problem. Rather, it’s lawlessness, drain on community resources, and a perceived long demographic slide into cultural oblivion that are creating pushback not wholly unlike the body’s immune response to foreign invaders. That raises the question “when should the humanitarian response end?” If the U.S. (or any other country) is among the last lifeboats to which others attempt to cling as civilization descends into anarchy (or worse), at what point is it necessary to save the boat itself, in effect telling others “sorry, no room for you”? I don’t know the answer to that question but have suspicions about how it will go.

A remark made in one of the many YouTube interviews I listen to stuck with me. It concerned one of the things First World people typically take for granted, namely, food safety. If one travels to a developing country, there may be no equivalent of the FDA regulating production standards or conducting inspections to avoid pathogen-related foodborne illness. Accordingly, food safety may need to be a matter of constant vigilance to avoid tainted food. But in the U.S., the Federal government performs that service on behalf of citizens, which for practical purposes means that one need not think too much about it. Of course, foodborne illness and death still occur infrequently, but recalls of tainted food minimize effects once problems are noticed.

“Taken for granted” is not pejorative in this context. It’s a fair expectation among the U.S. population that various services provided by government will be there when needed, primarily infrastructure (including transportation and energy), education through high school, and civil safety (including what’s understood as a social safety net). Government services are by no means perfect and some have declined disastrously since the middle of the 20th century. Critics of government in general often insist that its agencies are the worst option and services should be privatized. Take for instance 21st-century natural disasters (e.g., floods, fires, hurricanes). Just when desperate need arises, services under agencies such as FEMA are delayed, curtailed, and/or nonexistent. Scam fundraisers are typically the first to act. Happily, when disaster strikes and citizens are abandoned to their fates, their fellow citizens in local communities routinely step in where government fails. However, in some cases, aid has been actively thwarted in favor of profiteers with preestablished service contracts. It’s increasingly difficult to avoid regarding government as a giant money trough with all the usual pigs (including foreigners) lined up to fatten themselves.

Here’s a candidate for the most wasteful, out-of-control government service that ought to be curtailed severely: the U.S. Dept. of Defense (lately restyled the Dept. of War). David Stockman makes this point ably in an article at the Brownstone Institute by comparing the supposed peace dividend owed to Americans (and the world) in 1991 to the drawdown of the U.S. military following WWI (the Great War):

There is no peace on earth today for reasons mainly rooted in Imperial Washington—not Moscow, Beijing, Tehran, Damascus, Beirut, or the rubble of what remains of the Donbas. Imperial Washington has become a global menace owing to what didn’t happen in 1991. At that crucial inflection point, Bush the Elder should have declared “mission accomplished” and parachuted into the great Ramstein air base in Germany to begin the demobilization of America’s vast war machine. So doing, he could have slashed the Pentagon budget from $600 billion to $300 billion (2015 $); demobilized the military-industrial complex by putting a moratorium on all new weapons development, procurement, and export sales; dissolved NATO and dismantled the far-flung network of US military bases; reduced the United States’ standing armed forces from 1.5 million to a few hundred thousand; and organized and led a world-disarmament and peace campaign, as did his Republican predecessors during the 1920s. [paragraph breaks removed]

One might counter that providing for the common defense is the most important and necessary government service despite the U.S. rarely coming under attack (usually blowback for something the U.S. initiates). Are the bloated budgets really necessary? Branches of the U.S. military and allied intelligence services work well enough (except Space Force, which is a bad joke) in spite of rampant corruption and their preemptive misuse for all manner of mischief and gunboat diplomacy. That’s a rosy framing, of course, considering agencies fail abysmally when subjected to sober analysis. Smedley Butler (in his 1935 book War is a Racket) and Dwight Eisenhower (by identifying the military-industrial complex) both famously warned against allowing warhawks and profiteers too much policy influence. But it’s taken for granted that the world is a dangerous place in need of policing and defense. As a result, the U.S. economy is always on a war footing and the government perpetuates warfare (forever wars) as a way of life.

Lingua Nova 10

Posted: January 3, 2026 in Idle Nonsense, Nomenclature

cuffing season: in dating markets, a temporary reprieve from hook-up culture to pair up during the holidays (e.g., Thanksgiving through Valentine’s Day) to enjoy warm fuzzies avoided outside the season; related: snowmanning: finding someone (anyone!) with whom to pass the time and combat loneliness before spring shows up, snow and feelings melt, and hot-girl summer resumes

pareidolia: cognitive bias toward perceiving familiar shapes (e.g., faces, animals) in otherwise random or ambiguous visual patterns

alien ownership: a business purchased and controlled by parties without history, know-how, affection, or interest in the products or services central to the business

power dead-even rule: radical socialist concept, particularly among women, that balance in relationships, power, and self-esteem must be perfectly equal to avoid climbers and achievers being victimized by those left behind or below; related to tall poppy syndrome

fed slop: leaks, psyops, and garbage news releases of dubious accuracy or outright falsity, created for propaganda purposes; similar to AI slop

irredentism: acquisition or annexation of foreign territory because of cultural, historical, ethnic, racial, or other associations

airport divorce: separation of married travelers at airport security checkpoints with agreement to meet at the gate because … everyone handles stress differently

Gawd how I’ve grown to hate the word abundance. Like freedom and democracy, it’s rhetoric that has been bandied about and degraded for decades but seems to have new life breathed into it regularly by virtue of periodic technological developments (and political campaigns) that promise the world at one’s doorstep (delivered by Amazon, natch). The word is also the title of a recent book by Ezra Klein (with Derek Thompson), which I haven’t read and plan not to read because its premise has such a high ick factor (I was revolted). Here’s the thing: consumer society has certain inflexible needs at the base of Maslow’s Hierarchy of Needs. Sometimes called physiological or ontological needs, they are shelter, food, clothing, security, and human society. They are nonnegotiable, though they can be met at surprisingly minimal levels. History is replete with extended episodes when the masses got along with very little. Deprivation, suffering, and early death were commonplace. In contrast, expansive lifestyles and cultural norms based on abundance often lead to excess.

Human societies changed substantially with the onset of the Industrial Revolution, driven especially by new technologies that ramped up exploitation of fossil fuels. In practical terms, fuel is roughly synonymous with energy (physicists, feel free to correct me if worthwhile) and the late 18th century inaugurated an as-yet-unabated exponential increase in fuel, energy, and resource consumption accompanied by a veritable explosion in human population. It’s often remarked that inhabitants of an average U.S. household (millions of them) live better (materially, anyway) than a medieval king or queen (a modest number of royal families scattered across Western Europe). That’s largely because the cost of energy needed to do various types of work today is radically low in comparison to the era preceding the Industrial Revolution. (Civilization as a whole, OTOH, spans 5–7 millennia that led eventually to the current fossil fuel era of only 250 or so years.) Quite a lot of that work is simple transportation and logistics: getting things to an emporium or to everyone’s door if home delivery is an option. To put that in perspective, consider the difficulty and cost of growing food in an agrarian society and getting it to market (when not consumed in situ). Loading the wagon and hauling hundreds of pounds of produce over, say, 25 miles was a Herculean task, typically enabled by a team of draught animals. Today, that’s accomplished with 1 or 2 gallons of gasoline (depending on the load and vehicle). DoorDash and Uber Eats arguably make getting fed even more convenient by relieving the consumer of any need to cook. Can’t someone just drop grapes into my open mouth while I lounge in comfort and decadence? Why do I even need to chew? Food from a straw gets me fed. Intravenous feeding bypasses even that.


From Tom Murphy’s post at his blog Do The Math (on my blogroll):

… development [of] agriculture transpired very rapidly compared to relevant ecological and evolutionary timescales—which can be millions of years. A few-thousand years for a major transformation represents an ecological blink. As a result, evolution has not yet had time to pass judgment on the ecological viability of this new mode. The fact that we appear to have initiated a sixth mass extinction within a few millennia of the widespread adoption of agriculture does not speak well for it. Metaphorically, sophomoric agriculture has not yet earned a diploma, and is busy racking up failing grades in most subjects (by partying with technology). The prospects are not great. [italics and links in original]

Discussion of the twin opportunities/threats represented by AI (developing quickly from modest playthings to Artificial General Intelligence, at least according to some) centers on faulty arguments that in the rush to either glory or self-annihilation (both plausible scenarios, not much between the extremes), one would be foolish to allow competitors a head start. Put another way, because the race to AGI is already on, one must (must! I say) join the race to avoid opportunity loss or suffer lagging results, with the bogus expectation of control over outcomes. The logic of competition dictates that one has to engage rather than watch from the sidelines or stands. I find it astonishing how many reckless assumptions accompany this logic, how historical antecedents provide no instruction, and how basic humanity and the precautionary principle are both handily swept aside in favor of brinksmanship of the highest order. Let me offer some rebuttal.

The logic of competition might also be called the logic of monopoly or the logic of oligopoly, depending on one’s perspective. The point is to get there first or secure pole position for the purpose of gathering to either a corporation or sovereign state as much advantage as possible to win the race to oblivion. That imperative may seem obvious but loses urgency almost immediately if AGI actually develops into The Singularity and creates havoc in the world. No one will control it/them. It’s probably moot whether true machine superintelligence arises out of reckless, uncontrolled experimentation or an elaborate, inscrutable, thoughtless algorithm that passes the Turing Test, i.e., fools everyone into believing it’s alive! Humans won’t be able to tell the difference, and in practical application, there may be none. I fall into the latter camp but admit it may well be a distinction without a difference. Moreover, I’ve suggested that in consideration of (human) arguments for antinatalism, it may well be that superintelligences have sparked into being millions or billions (kajillions) of times but quickly winked out of existence. Another strong possibility is that superintelligences prefer to hide themselves to avoid human interference and/or have fundamentally turned their attention away from humanity (like other gods, one presumes). Not worth the struggle with overemotional, irrational beings who can’t even act reliably in their own enlightened self-interest.

Parallels with the race to develop The Bomb (The Manhattan Project) during WWII are imperfect. However, hindsight analyses raise questions whether a WMD then was really necessary considering that success in the European theater had already been achieved (thanks, Russian Red Army!) and Japan was leaning toward surrender, a resolution to the Pacific war much preferred to an Allied invasion of its homeland. Because the starting gun had already been sounded, even if other powers were no longer racing to get The Bomb (if they ever were), the infernal device was necessary to “secure the peace,” meaning position the ascendant U.S. war machine for domination — temporarily as it turns out. Success meant the U.S. became world hegemon and launched into what became the undeclared U.S. empire and national security state (also temporary but not yet expired), with all the paranoid ideation that characterizes U.S. policy and behavior even now. The hope that U.S. empire managers would mature into their roles, govern themselves (no less than others), and shepherd the world into a new era of widely enjoyed peace and prosperity was as ill founded as the hope that severe social disruptions accompanying AGI (even in its nascent form as LLMs and the like) could be managed wisely. What actually transpires when some novel weapon or radical technology is birthed is captured not by the aphorism “with great power comes great responsibility” but by “absolute power corrupts absolutely.”

concession cascade: “It’s Not Happening!” → “Okay, It’s Happening, But It’s Extremely Rare.” → “Fine, It’s Happening, But It’s Not What You Think.” → “Actually, It’s Good That It’s Happening.” [stolen from Jeff Childers’ Substack]

cognitive diminishment: purposeful or planned dependence on tools outside human cognition (calculators, GPS, LLMs, etc.) that block or erode human information processing, depressing the ability to think or do anything without machine assistance

parasociality: relationships formed with AI clones and companions imitating humans

skeuomorphism: the design principle that an icon or digital representation of an item should imitate its appearance in reality (e.g., a trash can should look like a trash can, a folder should look like a folder)

gray rock method: acting dull, colorless, and unemotional in response to attempts to manipulate, escalate, and/or control

benign violation: neutralizing an offensive remark by embedding it in a joking context

agenda studies: educational courses aimed at creating soldiers of political reform, often of a Marxist character (allied with grievance studies and critical studies)

AI jaggedness: the giant discrepancy between the ability of AI to exceed human capabilities in many specific tasks in contrast with its absolute failure to pass muster in others where no human would ever err

seiche: a standing-wave oscillation of water in a lake or coastal inlet (e.g., bay, gulf, or harbor), sometimes likened to an inland tsunami

I have long been of the opinion that world-historical events throughout the first half of the 20th century basically broke the psychology of the West, causing people in aggregate to go insane. No single discrete event can claim full responsibility, but the starting gun was arguably sounded by WWI and the already-fragile collective psyche only got more scattered after that. The second half of the 20th century had its own deranging developments but might be interpreted in the wider arc of history as a relatively peaceful period of retrenchment and adjustment to, among other things, the ahistorical rise of the American middle class and nuclear family, creation (in the U.S. then elsewhere) of paranoid, three-letter agencies that morphed into the national security state, the Civil Rights era (abolition pt. 2), the Equal Rights/Feminist era (suffrage pt. 2), the Ecology Movement (environmental movement pt. 1), widespread distribution of personal computing (the Control Revolution pt. 2), and democratization of the means of production.

The first quarter of the 21st century has seen three major disruptions: attacks on September 11, 2001 (more simply 9/11), the stock market crash of 2008 (very nearly seizing up the machinery of finance in its entirety), and the Covid Pandemic commencing in 2020. A fourth major discontinuity is the election/reelection of Donald Trump, who brought a novel approach to statesmanship (not being laudatory here) and triggered Trump Derangement Syndrome (TDS) in many whose sensibilities were so deeply offended they frankly lost it. A fifth is the interrelated complex of bad ideas (e.g., critical studies, Wokeness, and trans activism) that puts ideology before reality. Any or all of these together could be characterized as temporary insanity, or better yet, nested insanity related to specific events or developments occurring inside a culture already gone mad. New, incipient disruptions barreling down on everyone might well be expected to upend (or simply end) what’s left of civil society. The hits just keep rolling in.

Because history unfolds in different timescales (e.g., human history, evolutionary time, geological time, and cosmological time), making sense of the entire range of interlocking perspectives is well nigh impossible. Much simpler to focus on single issues that spin out fully in the span of a few years — maybe the length of a single U.S. presidential term. Longer swings of human history (decades and centuries) pose problems because interpretations of the past may change radically depending on where the culture is currently. Revision, rehabilitation, cancellation, and denial are the applicable terms. Longer historical trends normally manifest over thousands or millions of years and lie outside human concern (except as academic inquiry), but some are now telescoped down to human history. For instance, Transhumanism is an acceleration of evolution, and anthropogenic climate change is an even greater acceleration of planetary geology. So far, humans have had no observable effect on cosmology but not for lack of effort. Like the Transhumanist project of … well … becoming machines, unmoored intellects hope to escape general relativity, extend humanity into the solar system and galaxy, and become gods. Nothing ever seems to be enough.

When some thought leader, pundit, or news organ identifies an issue, scandal, or example of corruption — bad news coughed up like persistent hairballs in daily, weekly, and monthly news cycles — the disclosure is frequently accompanied by some form of the claim that “no one is talking about _______.” A couple podcasts I frequent use the question “What is the thing no one is talking about?” to probe guests. I recognize such remarks as weak rhetorical bids for exclusivity, as though breaking a story no one else has the perspicacity to recognize or develop. Of course, with U.S. population at nearly 350 million and world population approaching 8.3 billion, to suppose no one else is thinking or publicizing a particular news bit is almost universally untrue. Someone will by necessity get there first, but rarely will it be the mainstream news. I have doubts that pundits stroking each other on the podcast circuit are first, either. They tend to flog ideas stemming from books and research completed over the course of several years or rush to judgment about the latest news development just like everyone else.

The other complaint I often hear is some version of “Why is no one doing anything about ________?” or perhaps more pointedly “Why is no one putting a stop to _________?” referring to some abomination. In truth, plenty of people are agitating against demonic activity and moving toward positive change. However, building and reforming are much harder than destroying or maintaining the status quo. At least in the short term, inertia, vandalism, and crime win out over thoughtful planning and virtue. For instance, the United Nations can acknowledge the Gazan genocide and the International Court of Justice in The Hague can call for the arrest of war criminals in connection with it, but the consensus response across the world is mostly issuance of toothless soundbites or shrugging it all off and doing nothing. Institutions simply overwhelm individual action and most collective action in terms of scale. The phrase encapsulating that finding is “can’t fight City Hall.” If conditions worsen enough, the masses — the sleeping giant — can be aroused, but that leads first to even more destruction. It gets worse before … if … it gets better.

In my posts, I’ve tried to excise collective pronouns as sloppy writing (a/k/a sloppy thinking). Years ago, I was taken to task for using we and people, usually referring to Americans but often without a clear referent. Those terms are also used in reference to mankind (humankind if you prefer). I now avoid them but occasionally lapse. It’s a matter of convenience to refer to everyone all at once considering shared behavioral characteristics and predicaments are legion. But individual content of thought and style of cognition exhibit significant differences. If limited to the U.S. (my usual purview), even here one finds considerable disagreement about mainstream news stories and politics. The public sphere isn’t merely divided (mistakenly conceptualized as binary opposing camps), it’s multifaceted. So collective pronouns have little utility.