Open Source Trained AI. AI Forgot to Pay
And the bill is coming due…
Adam Wathan built Tailwind CSS. The most popular CSS framework in the world. 75 million downloads per month. When you ask Claude to build a landing page, it reaches for Tailwind without you asking.
Last week, he laid off 75% of his engineering team.
Revenue down 80%. Doc traffic down 40%. The framework is thriving. The business is dying.
Someone submitted a PR to add LLM-optimized docs. Wathan had to decline. Optimizing for agents accelerates his company’s death. He’s being asked to build the infrastructure for his own obsolescence.
This isn’t a Tailwind problem. This is a structural break.
The Broken Part
Open source had an implicit contract.
You build something useful. You give it away. In return, you get attention — people visit your docs, learn your name, discover your premium tier, maybe hire you.
Attention was the currency. Human eyeballs on your work.
AI broke that.
Now the models scrape your docs, absorb your code, learn your patterns — then redistribute that knowledge directly to users. The user gets Tailwind without ever visiting tailwindcss.com. They don’t see the premium tier. They don’t know Adam Wathan exists.
The knowledge still transfers. The money no longer flows back.
AI Is Good Because Humans Were Good
Here’s what people miss in the hype.
AI didn’t invent Tailwind. Adam Wathan did. Years of design decisions. Clear documentation. Intuitive defaults. Community feedback. Iteration after iteration.
AI didn’t write Stack Overflow answers. Millions of developers did. Taking time. Explaining concepts. Debugging strangers’ code for free.
AI codes well because of decades of well-crafted human code, tutorials, and libraries.
AI is a redistribution engine. It didn’t create the value. It moves it — from creators to platforms, from maintainers to shareholders, from open source to closed products.
The Retreat Has Started
I’m not the only one seeing this.
Marc Schmidt — maintainer of libraries with over a million downloads per month — announced this week he’s going closed source:
“OSS monetization was always about attention. Human eyeballs on your docs, brand, expertise. That attention has literally moved into attention layers. Your docs trained the models that now make visiting you unnecessary.”
His solution: gate access directly. No pay, no code.
It’s rational. When the downstream funnel is broken, you move payment to the only point that works: the gate.
But here’s my concern with everyone going closed source: it destroys the commons.
Open source worked because knowledge compounded. You built on my library. I built on yours. The ecosystem grew faster than any company could alone.
If every maintainer retreats behind a paywall, we lose that compounding. A fragmented landscape of proprietary micro-dependencies pops up everywhere. And AI can no longer train on that code, which makes the models less capable — and the gated libraries less used.
We Need New Plumbing
The music industry hit this wall twenty years ago.
Napster, then Spotify, disintermediated distribution. Artists saw reach explode and revenue collapse. Same dynamic.
They eventually built infrastructure: ASCAP, BMI, SoundExchange. Streaming pays royalties per play. Not perfect and artists still complain about rates — but there’s a mechanism. Value flows back.
Open source has no equivalent. No ASCAP for code. No per-token royalty. No revenue share when Claude writes Tailwind.
I'd like to apply a similar pattern to AI, but the implementation is hard.
How do you attribute token usage to specific libraries? How do you meter knowledge baked into model weights? Who pays — AI companies, enterprises, developers?
These are real, hard problems, and only time will tell what the answers look like.
But the conversation needs to start now. Before AI writes 99% of code and there’s no one left to maintain the libraries it depends on.
Directions Worth Exploring
AI-specific licensing. New licenses that permit usage but require revenue share when AI generates code commercially. Think of it like a streaming royalty — you use the library through an agent, a fraction of a cent flows back.
Attribution infrastructure. Tools that surface which libraries AI relied on for a given output. Right now it’s a black box. It doesn’t have to be.
AI company contributions. Voluntary or mandated funding from model providers back to the open source ecosystem. Google just became a Tailwind sponsor after the layoff news. That’s a start — but it’s charity, not a system.
Dual licensing for AI. Free for humans. Paid for AI-mediated commercial use. The library stays open, but the business model adapts.
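To make the streaming-royalty analogy concrete, here is a minimal sketch of what attribution plus revenue share could look like at the smallest scale: scan an AI-generated snippet for references to registered libraries, then split a flat per-generation royalty proportionally among them. Everything here is hypothetical — the registry, the maintainer accounts, and the royalty rate are illustrative placeholders, not a real system or API.

```python
# Hypothetical sketch: attribute an AI-generated snippet to the open-source
# libraries it references, then split a per-generation royalty among them.
# All names and rates are illustrative assumptions, not a real mechanism.
import re
from collections import Counter

# Illustrative registry mapping library names to maintainer payout accounts.
REGISTRY = {
    "tailwindcss": "tailwind-maintainers",
    "react": "react-maintainers",
    "lodash": "lodash-maintainers",
}

def attribute_libraries(generated_code: str) -> Counter:
    """Count mentions of registered libraries in a generated snippet."""
    counts: Counter = Counter()
    for name in REGISTRY:
        hits = len(re.findall(rf"\b{re.escape(name)}\b", generated_code))
        if hits:
            counts[name] = hits
    return counts

def split_royalty(counts: Counter, royalty_cents: float) -> dict:
    """Split a flat per-generation royalty proportionally to usage."""
    total = sum(counts.values())
    if total == 0:
        return {}
    return {
        REGISTRY[lib]: royalty_cents * n / total
        for lib, n in counts.items()
    }

snippet = 'import "tailwindcss"; /* utility classes via tailwindcss */'
counts = attribute_libraries(snippet)
payouts = split_royalty(counts, royalty_cents=0.1)  # a fraction of a cent
```

The hard part is not this arithmetic — it's that real attribution would have to see through model weights, not just string matches in output. The sketch only shows where the money would flow once attribution exists.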
None of these are simple. All of them are better than watching the ecosystem collapse.
The 10% That Matters
My framework for AI development: AI does 90% of the work. The last 10% — specifications, system design, guardrails — is where projects succeed or fail.
Same applies to the ecosystem.
AI generates 90% of the code. But the 10% that makes AI good — the frameworks, the documentation, the accumulated craft of open source — comes from humans.
If we don’t sustain that 10%, the 90% degrades. Models trained on AI-generated slop. Hallucinated docs. Abandoned libraries. The foundation rots.
Open source isn’t optional infrastructure. It’s the substrate AI stands on.
We need to start treating it that way.
AI writes the code. Humans designed the system that made it possible.
Time to design a system that keeps it possible.
This is the kind of structural thinking I live to dig into.
The boring parts that actually matter — specifications, guardrails, system design.
If you’re building with AI agents and want to understand the 10% that determines success, subscribe to Dev3o.

