Control
Machine, Animal...Society?
While digitization as a technological transformation may appear to be the subject of this project, I am interested in digitization mainly because I am interested in its effects on social control. I am not writing a history of technology. I am tracking the progress of digitization as a political economist interested in its effects on societal governance.
Networked computing is a technology of communication and control. For those who don’t know already, I am invoking Norbert Wiener’s definition of cybernetics, which he called “the science of control and communication in the animal and the machine.” Wiener is one of the founders of digitization, and one of the most thoughtful about its social implications. His encounter with the digital came from helping the U.S. military find a way to control anti-aircraft guns during World War II. Predicting both the movement of the airplane and the trajectory of the bullets required a scientific method for controlling the operation of a complex interactive system. This experience, along with his exposure to and parallel development of information theory with Claude Shannon, led Wiener to conclude that communication and feedback-induced adjustments were central to the exercise of control.
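Wiener’s core idea can be stated mechanically: measure the gap between a goal and the system’s current state, and feed a correction back in, over and over. The toy proportional controller below is a minimal sketch of that loop; the function name, gain, and numbers are my own illustrative choices, not anything from Wiener, but the structure is the same one his anti-aircraft predictor used to steer a gun toward a moving target.

```python
# A minimal sketch of Wiener-style feedback control: a proportional
# controller repeatedly measures the deviation between a target and the
# system's current state and feeds a correction back into the next step.
# All names and constants here are illustrative, not drawn from Wiener.

def steer(state: float, target: float, gain: float = 0.5, steps: int = 50) -> float:
    """Drive `state` toward `target` by feeding back the error each step."""
    for _ in range(steps):
        error = target - state      # communication: measure the deviation
        state += gain * error       # control: adjust in proportion to it
    return state

final = steer(state=0.0, target=10.0)
print(round(final, 6))  # converges on the target, printing 10.0
```

The point of the sketch is only that “control” here is nothing but iterated communication: each cycle of the loop is a message about error, and the error shrinks geometrically as the feedback repeats.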
Cybernetics thus became “the science of control and communication in the animal and the machine.” According to Wiener, it is “the study...of the effective messages of control.” “Its name,” he said, “signifies the art of pilot or steersman.” Wiener and many of his contemporaries – including a growing circle of mathematicians, systems theorists, philosophers, biologists, and psychologists – implied that their theory could describe any system.
Did this include social systems?
Wiener himself was not so clear about this. He wrote that “society can only be understood through a study of the messages and the communication facilities which belong to it,” and predicted (correctly) that the “future development of these messages and communication facilities, messages between man and machines, between machine and man, and between machine and machine, are destined to play an ever-increasing part” in society. But does understanding these communicative relations give us a technology for controlling society? In his 1964 book God and Golem, Inc. (pp. 87, 90) he said that the science of control and communication applicable in engineering and physiology was also applicable in sociology and economics (while complaining that the mathematics economists use “are the mathematics and mathematical physics of 1850”). The British psychiatrist W. Ross Ashby seemed more confident that social systems were the equivalent of machines and animals. He wrote in his 1956 book, An Introduction to Cybernetics, that the study of cybernetics “is likely to reveal a great number of interesting and suggestive parallelisms between machine and brain and society.”
Machine, brain, society. As objects of control, are they equivalent?
How we answer that question matters. Scientific understanding of a process generally produces technologies that give humans the power to harness and control it. Applied physics and chemistry, for example, give us the ability to send rockets into space. If cybernetics provides truly scientific insights into the operation of social systems, it should also provide us with the ability to engineer society. Digital technology has given us the ability to design and manage robots, computers, and telecommunications networks. Does this also mean it can give us a technology of managing, organizing and controlling social systems, which necessarily means controlling people as components of those systems?
This project will argue that human society is neither a machine nor an animal. Thus, it requires a radically different understanding of “control.” To model the steering of human society as a “spaceship” with a “pilot,” as Buckminster Fuller famously did in the late 1960s, is not just factually wrong but conceptually misguided. Societal steering is not done by a “steersman” but by many human agents, with different interests and autonomous wills. Humans can have both cooperative and conflicting ideas about who should exercise control and what should be done with it.
Wiener’s original definition of cybernetics only referred to animals and machines. He never explicitly framed it as a technology of social control. Indeed, in his greatest book, The Human Use of Human Beings, he rejected top-down hierarchies in which “all orders come from above, and none return.” He wrote: “I wish to devote this book to a protest against this inhuman use of human beings.” He must have sensed that decisions about the control of society, of people, are of a different type than controlling machines - that feedback from the “controlled” people should also be controlling.
Is there a cybernetics of society?
Until now, social science applications of cybernetics have had nothing coherent to say about the status of social systems as an object of control. During the 1960s, some cybernetically informed ideas were applied to organizations, that is, to the management of businesses, with limited success. Jay Forrester’s System Dynamics modeled stocks, flows, and feedback loops in supply chains, while Stafford Beer’s management cybernetics applied similar ideas to organizational decision making. Herbert Simon wrote in the early 1970s that “The terms ‘operations research’ and ‘management science’ are nowadays used almost interchangeably to refer to the application of orderly analytic methods, often involving sophisticated mathematical tools, to management decision making and particularly to programed decision making.” Ultimately, however, the applications of computing and systems analysis were just analytical techniques used by management consultants. They did not usher in a new age of scientific management, much less societal governance.
An organization is just one social unit and is supposed to have a well-specified objective; in that respect, it is more like a humanly-designed machine than a human society. “Society” does not have a single objective; it is a system of interaction that involves many organizations and many individuals with different goals, preferences and purposes. It does have an order, but it is an emergent one: billions of individual agents with potentially incompatible methods of interacting cooperate, coordinate, compete and conflict. Societies must generate public, not just private (firm-level) governance.
Most attempts to apply cybernetics to this higher level – to the truly social – have been practical failures if not outright threats to human freedom.
The Technocracy movement, born out of the crises and ideological fervor of the Great Depression era, had hundreds of thousands of members and chapters all over the U.S. and Canada in the 1930s and ‘40s. The Technocrats despised both the market economy and democracy. Inspired by anti-capitalist thinkers like Thorstein Veblen, they believed engineers should be the ruling class, authorized to perform “the scientific operation of the entire social mechanism.” Exposed to cybernetics in the 1950s, the Technocrats became excited about the way it visualized “the possibility of an ‘automatic social mechanism’ that would make all political and economic decisions.” Society was, to them, a machine.
Orthodox Marxian socialists also embraced systems-thinking and its potential for centralized top-down governance. Oskar Lange, a Polish economist best known for his work On the Economic Theory of Socialism (1936), argued that central planners could use the mathematical tools of neoclassical economics to replace markets. No need for individual humans to buy and sell on their own; the government could allocate resources efficiently using Walrasian equilibrium equations. Lange’s 1970 book, Introduction to Economic Cybernetics, invoked Wiener and asserted that “the principles of automatic regulation in technical equipment” are the same as “the regulation and control of social and economic processes.” Digital technology, he believed, gave us the tools to consciously control and design the economy.
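Lange’s claim can be made concrete with the Walrasian tâtonnement process he drew on: an auctioneer (or planner) raises the price of a good in excess demand and lowers it in excess supply until the market clears. The one-good sketch below shows the feedback loop Lange wanted a computer to run; the linear demand and supply curves and the adjustment rate are invented for illustration, not taken from Lange.

```python
# Illustrative Walrasian tatonnement: adjust price in proportion to
# excess demand until supply equals demand. The demand/supply curves
# and the adjustment rate are made-up numbers chosen only to show the
# feedback loop a central planner would have to compute.

def demand(p: float) -> float:
    return 100.0 - 2.0 * p   # hypothetical linear demand curve

def supply(p: float) -> float:
    return 10.0 + 1.0 * p    # hypothetical linear supply curve

def tatonnement(price: float = 1.0, rate: float = 0.1, tol: float = 1e-9) -> float:
    """Raise price when demand exceeds supply, lower it otherwise."""
    while True:
        excess = demand(price) - supply(price)
        if abs(excess) < tol:
            return price
        price += rate * excess   # the planner's feedback rule

print(round(tatonnement(), 4))  # equilibrium where 100 - 2p = 10 + p, i.e. 30.0
```

With one good and known curves the loop converges trivially; Hayek’s objection, discussed below, was precisely that a real economy has millions of interdependent goods whose curves are not known to any central node, only revealed piecemeal through dispersed local knowledge.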
A similar vision motivated Stafford Beer’s ill-fated attempt to run the Chilean economy from a central computing hub for the socialist government of Salvador Allende. Beer’s collaboration with Chile’s elected Marxist leader in the 1970s tried to extend cybernetics to the governance of a national political economy. The project, known as CyberSyn, short for “Cybernetic Synergy,” failed to centralize the data needed even to monitor, much less control, Chile’s economy. Eden Medina’s detailed history of the project, Cybernetic Revolutionaries, concluded that “Real-time data processing encompassing most of the economy remained a pipe dream,” something that F.A. Hayek could have told them 30 years before.
Beer’s stab at cybernetic politics was even less complete. A project known as “Cyberfolk,” inspired by Allende’s idea of power emanating from “the people,” proposed to let citizens provide political feedback indicating their level of satisfaction or dissatisfaction with government policies or televised speeches. In essence, it envisioned installing an “algedonic meter” - a 1970s version of a Facebook “Like” button - in every home. Though perhaps sincere in its democratic aspirations, Cyberfolk’s “Unhappy/Happy” dial rested on a naive equation of mechanized feedback with the human nervous system. It completely missed the point: how would these social “nerves” control the muscles, the state? How, in other words, would citizens’ twists of the algedonic dial be translated into political action, laws, and institutions? What rules would govern the collection and use of this data? How binding would the results be on government officials and politicians?
In his 2002 book Machine Dreams: Economics Becomes a Cyborg Science, Philip Mirowski describes what he considers to be the cybernetic transformation of the social sciences in the post-WWII era. He documents (in excruciating detail) an intellectual movement in economics to redefine the human subject as a computational processing unit, a turn towards what he calls “cyborg science.” He argues that the economy began to be viewed not as a field of human exchange, but as a giant, distributed machine for processing information, in which the distinction between human agency and mechanical calculation disappears. He cites the role of the Cowles Commission, the RAND Corporation, and the Office of Naval Research (ONR) in supporting and funding this transformation. The Cowles Commission spearheaded the movement of economics into a formalistic, mathematical discipline. RAND in the 1950s and ‘60s brought together mathematicians, physicists, and economists to work on problems of game theory and rational choice in nuclear warfare, which were then imported directly into civilian economic theory. The ONR funded the early development of linear programming and operations research, which provided the tools to treat the economy as an engineering problem. Mirowski emphasizes that this was not just an intellectual shift, but one driven by the logistical and financial needs of the Cold War military-industrial complex. These tools were supposed to allow for the management of large-scale, complex systems like the military supply chain and even the national economy. Mirowski argues that (in theory, not so much in practice) the result was the replacement of the “human” in economics with an “automaton,” an efficient information processor.
The common flaw in all these applications of cybernetics is that they modeled society as if it were a machine or organism and failed to take human individuality and human agency - both individual and collective - into account. They ignored the simple fact that the basic components of society – people – can act autonomously as individuals, and can learn and communicate with each other to coordinate action and cooperate in production. They can, individually and collectively, compete or engage in conflict over their differing goals and values. Society is not a machine, and government is not the brain of a biological organism.
Still, social institutions and organizations, both cooperative and competitive, do hinge on many forms of communication and feedback, and information and communication technologies play a starring role in any system of social ordering.
A true cybernetics of society is yet to be developed. And yet, as digitization proceeds, it is essential that we have one.
Can a cybernetic society still leave room for irreducibly human qualities like ambiguity, conscience, dissent, and interior life? If not, once governance runs through signals, scores, and feedback loops, we risk treating people less like citizens and more like nodes in a control system.