The Assumptions That Never Change
Agile failed. Now AI is failing. And the reason, in both cases, is the same one that killed Lean manufacturing in Detroit, buried ERP implementations, and let newspapers watch the internet destroy them. Management refuses to update its beliefs.
There is a graveyard of methodologies and technologies that should have transformed organisations but didn’t. Their tombstones share an epitaph: adopted in form, rejected in spirit.
The latest two headstones are being carved right now. Agile — the software development philosophy that promised adaptive, collaborative, human-centred work — became, in most organisations, a peculiar regime of standups, story points, and Jira tickets that somehow produced more bureaucracy than the waterfall processes it replaced. And AI — the most powerful productivity technology in a generation — is being procured, announced, and quietly abandoned at a rate that might alarm anyone paying attention.
Both failures have a familiar shape. The tools were bought. The consultants were hired. The frameworks were installed. And then nothing fundamentally changed, because the people at the top never changed the one thing that needed to change: what they believed.
‘The organisation imports the visible artefacts of the new approach whilst management protects the invisible assumptions the new approach directly threatens.’
The Agile Illusion
Agile was not primarily a project management approach. It was a philosophical claim: that software is best built by small, trusted, self-organising teams who can respond to change, and that the job of management is to protect those teams from bureaucracy and give them the conditions to think clearly. The Agile Manifesto (Beck et al., 2001) said this plainly. Almost nobody who adopted Agile read it that way.
What most organisations actually did was translate Agile into a management reporting framework. Two-week sprints became two-week checkpoints. The product backlog became a demand queue. The Scrum Master became a junior project manager with a new job title. The daily standup became the most tedious meeting in the building. Velocity — a team’s internal sense of its own pace — became a metric executives tracked on dashboards and used in individuals’ performance reviews.
The core assumption that Agile required management to update was this: that developers are knowledge workers whose judgement should be trusted, not inputs to be measured and directed. Most organisations never accepted that. They kept the ceremonies and discarded the philosophy. The result was monstrosities like SAFe — Scaled Agile Framework — a methodology so baroque that it required its own certifications and effectively recreated the waterfall programme management it was supposed to replace. Consultants got rich. Software quality did not improve.
The AI Performance
The AI adoption story is following the same script with remarkable fidelity, just faster.
Organisations have purchased enterprise licences for AI tools. They have appointed Chief AI Officers. They have run lunch-and-learns and internal hackathons. They have issued press releases about their AI transformation. And in a very large number of cases, the actual use of AI has remained confined to a small number of enthusiastic individuals who figured it out themselves, whilst the broader organisation waits for governance frameworks, approved use-case lists, and legal clearance that never quite arrives.
The reason is the same as with Agile. AI, properly used, requires management to accept that workers can think alongside a tool that has more information than their managers, can iterate faster than any approval process allows, and can make decisions without running them up the hierarchy. It requires accepting that value comes from experimentation and judgement, not compliance and predictability. Most management structures are not built on those assumptions and have no intention of becoming so.
So instead, AI gets adopted in the one place where it doesn’t threaten anyone: the slide deck. It gets added to PowerPoint. It generates first drafts that a committee then rewrites into the same language that would have been produced without it. It summarises documents that perhaps should not have been written. The technology performs transformation. The organisation performs having adopted it.
This Is Not New
The phenomenon has a long and well-documented history. Every generation produces a new methodology or technology powerful enough to require genuine belief change, and every generation of management finds a way to adopt its appearance without its substance.
1970s–80s — Lean Manufacturing
Toyota’s system was copied everywhere. It worked almost nowhere outside Japan. Western executives took the kanban boards and missed the point: that frontline workers are the primary source of process intelligence. Womack, Jones, and Roos (1990) documented this transfer failure extensively, noting that the physical tools of the Toyota Production System were faithfully replicated whilst the underlying philosophy of worker trust and continuous improvement was quietly set aside. That belief was non-negotiable for Toyota. Rejecting it was non-negotiable for its imitators.
1980s–90s — Total Quality Management
Deming’s (1986) 14 Points required management to eliminate fear, destroy internal competition, and abandon numerical targets. American firms kept all three and launched quality departments instead. Deming called this ‘the same old stuff’. He was right.
1990s–2000s — ERP Implementation
Failure rates for SAP and Oracle deployments routinely exceeded 50 per cent. Davenport (1998) identified the root cause clearly: executives assumed the software would conform to existing processes, when in fact the software encoded better ones. Adapting to it meant admitting that current practice was suboptimal. Nobody wanted to hear it.
1990s–2000s — Newspapers and the Internet
Most major papers had the brand, the audience, and the early web presence to dominate digital news. They failed because they believed they were in the newspaper business. Christensen’s (1997) framework for disruptive innovation anticipated this precisely: incumbents fail not because they lack capability, but because their assumptions about what business they are in prevent them from deploying it. The assumption protected print revenue until there was no print revenue left.
2010s — Digital Transformation
Enterprises hired CDOs, opened innovation labs, and ran hackathons — whilst leaving intact the annual budgeting cycles, legal approval chains, and risk aversion that made them slow. The labs produced pilots. The pilots died. Kotter (1996) argued that transformation fails when the urgency is performed rather than felt; the innovation lab is perhaps the clearest institutional expression of performed urgency ever devised.
Ongoing — Evidence-Based Medicine
Research consistently shows an average gap of 17 years between robust clinical evidence and standard practice (Morris, Wooding, & Grant, 2011). The mechanism is not ignorance. It is that updating protocols implicitly admits prior practice was suboptimal — threatening status, identity, and liability. The assumption ‘we are already practising good medicine’ actively blocks the adoption of better medicine.
The Cargo Cult Pattern
Anthropologists coined the term ‘cargo cult’ to describe communities in the Pacific who, after contact with American military logistics in the Second World War, built wooden airstrips and control towers in hopes of summoning supply planes. The practice replicated the visible form of something that worked whilst missing the underlying system entirely.
Every failed methodology adoption is a management cargo cult. The organisation builds the airstrip. It holds the ceremonies. It names people after the roles. It waits for the planes. The planes don’t come, because the planes were never the point.
What makes Agile and AI particularly prone to this failure is that both, at their core, ask management to do the one thing most managers have built careers specifically to avoid: cede control. Agile asks managers to trust teams to self-organise and make product decisions. AI asks managers to accept that value creation will be distributed, emergent, and not amenable to the standard instruments of oversight and measurement. Both require a fundamental belief change about where intelligence in an organisation lives.
‘The ceremony is adopted. The belief is not. And so the cargo cult performs its rituals and wonders why the planes never land.’
What Would Genuine Adoption Look Like?
It would look uncomfortable. It would involve managers giving teams real authority over their tools and methods. It would involve abandoning approval processes that exist to protect hierarchy rather than reduce risk. It would involve accepting that the relevant unit of measurement is not activity or compliance, but needs met — and that outcomes in complex environments are not predictable in advance.
It would also involve a specific kind of intellectual honesty that organisations rarely reward: the willingness of management to say, publicly, ‘the way we have been operating was based on assumptions that are no longer correct’. That is the sentence that never gets said. It is also the sentence on which everything depends.
The Diagnosis
The failure mode is not technical. It is not a lack of training, budget, or talent. It is a specific act of organisational self-protection: the decision, usually unspoken and often unconscious, to preserve the beliefs that justify existing power structures against the pressure of a new methodology that would redistribute that power.
The technology changes every decade. The pattern does not. Until organisations find a way to make belief and assumption changes survivable for the people at the top — or until those people are replaced by others who already hold different beliefs — the graveyard will keep filling. See also: #NoManagement
Further Reading
Beck, K., Beedle, M., van Bennekum, A., Cockburn, A., Cunningham, W., Fowler, M., Grenning, J., Highsmith, J., Hunt, A., Jeffries, R., Kern, J., Marick, B., Martin, R. C., Mellor, S., Schwaber, K., Sutherland, J., & Thomas, D. (2001). Manifesto for agile software development. https://agilemanifesto.org
Christensen, C. M. (1997). The innovator’s dilemma: When new technologies cause great firms to fail. Harvard Business School Press.
Davenport, T. H. (1998). Putting the enterprise into the enterprise system. Harvard Business Review, 76(4), 121–131. https://hbr.org/1998/07/putting-the-enterprise-into-the-enterprise-system
Deming, W. E. (1986). Out of the crisis. MIT Press.
Kotter, J. P. (1996). Leading change. Harvard Business School Press.
Kuhn, T. S. (1962). The structure of scientific revolutions. University of Chicago Press.
Liker, J. K. (2004). The Toyota way: 14 management principles from the world’s greatest manufacturer. McGraw-Hill.
Morris, Z. S., Wooding, S., & Grant, J. (2011). The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine, 104(12), 510–520. https://doi.org/10.1258/jrsm.2011.110180
Womack, J. P., Jones, D. T., & Roos, D. (1990). The machine that changed the world. Rawson Associates.