In psychology and cognitive science, cognitive biases are systematic patterns of deviation from norm and/or rationality in judgment.
– Wikipedia, “List of cognitive biases”
We love to treat biases like bugs in our brains. Little glitches in the wetware that we can patch once we know about them. “Oh, that’s just confirmation bias,” we say, as if naming the demon exorcises it. Spoiler: it doesn’t.
What if biases aren’t bugs at all? What if they’re systems? Complete with inputs, outputs, feedback loops, and a strange kind of logic that makes them incredibly stable. Let me walk you through three of them.
Confirmation Bias: The Spam Filter That Learns Too Well
You have a belief. Let’s say you believe that your team’s new testing approach is solid. Now information comes in. Some confirms your belief, some challenges it. Here’s where the system kicks in: confirming information flows right through. Challenging information? Filtered out. “That’s an edge case.” “They don’t understand our context.”
The feedback loop is elegant in its simplicity. The more you filter out contradicting evidence, the stronger your belief becomes. The stronger your belief, the stricter your filter. It’s self-reinforcing.
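If you prefer seeing loops to reading about them, here’s a toy simulation of that filter. To be clear: this is not a model of real cognition, just the feedback structure, and every number in it is invented.

```python
import random

random.seed(42)

belief = 0.5  # strength of the belief, 0..1

for _ in range(1000):
    # Balanced evidence stream: half confirms, half challenges.
    if random.random() < 0.5:
        belief += 0.01 * (1 - belief)   # confirming info always flows through
    elif random.random() > belief:
        belief -= 0.01 * belief         # a challenge survives the filter
    # else: the filter discards the challenge. The stronger the
    # belief, the stricter the filter (rejection probability = belief).

print(f"belief after 1000 balanced observations: {belief:.2f}")
```

Even though the evidence stream is perfectly balanced, the belief only ever strengthens: the filter throws away most of what could pull it down, so the net drift always points toward certainty.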
And here’s the thing: this isn’t stupid. It’s efficient. Your brain can’t process everything. It needs shortcuts. Plus, your beliefs are tied to your identity. Questioning them hurts. The system protects you from cognitive overload and existential discomfort. It’s doing its job.
But there’s a social layer too. We don’t form beliefs in isolation. We learn what to believe from our tribes. Challenging a belief often means challenging your belonging. The group rewards conformity, punishes doubt. So the social system reinforces the cognitive one. Echo chambers aren’t a bug of social media. They’re the natural output of this system, amplified.
Think of it like an email spam filter that learns from your behavior. Eventually, it gets so “good” that important messages end up in spam too. The filter works. It just doesn’t work for what you actually need.
Sunk Cost Fallacy: The Trap of Commitment
You’ve invested six months into a project. It’s not going well. Every rational indicator says: stop. But you don’t. You invest more. Why?
The system looks like this: you have a stock of investment (time, money, emotion). Abandoning the project means that stock turns into pure loss. And loss hurts. So the system’s output is: keep going. Which increases the stock. Which makes future abandonment even more painful.
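The same trick works here. A minimal sketch of that stock-and-flow structure, with an invented “pain of writing off” function; the numbers mean nothing, the loop is the point.

```python
sunk = 6              # months already invested (the stock)
monthly_value = -1.0  # rational forecast: each further month destroys value
pain_per_month = 0.5  # invented: how much writing off one sunk month hurts

for month in range(7, 19):
    utility_of_quitting = -pain_per_month * sunk  # realize the whole loss now
    utility_of_continuing = monthly_value         # look only at the future
    if utility_of_continuing > utility_of_quitting:
        sunk += 1   # keep going: the stock grows, and with it the future pain
        print(f"month {month}: continue (sunk = {sunk} months)")
    else:
        print(f"month {month}: quit")
        break
```

A rational decision would compare only future value. But as soon as the stock of past investment enters the comparison, continuing always wins, and every month it wins by a larger margin.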
This isn’t irrational in every context. Persistence is valuable. Finishing what you started signals reliability to others. In environments where giving up early was often the wrong choice, this bias helped our ancestors survive. The system is adapted to a world that doesn’t always match our spreadsheet logic.
And socially? We’ve built entire value systems around it. “Winners never quit.” “Stay the course.” We admire the founder who mortgaged everything and made it. We don’t tell stories about the ones who wisely walked away. Or about those who lost everything. Quitting is shameful. The social feedback loop punishes rational abandonment and rewards irrational persistence.
It’s like that project at work that everyone knows should be cancelled, but nobody wants to be the one to say it. Too much budget already spent. Too many careers tied to it. The system keeps it alive.
Garbage Piling Up: The Environment That Teaches, or the Broken Windows Theory
If you’ve been following me on social media for a while, you know this topic is close to my heart. It’s that time of year again when I grab a bag and head down to the river, along my “way to work” – it’s a round trip, because I work from home – and collect garbage before nature starts blooming. Yesterday’s tour inspired me to write this whole post.
This one operates at a purely social level. Someone leaves a bit of garbage by the river. Sometimes it’s just the wind blowing something down from the federal highway. Nobody removes it. Signal received: this is a place where people dump things. More garbage appears. Then, at certain places, larger items. The feedback loop turns one careless act into a systemic dumping ground.
The components: the garbage on the ground is the stock. People’s littering behavior is the flow. And the perceived norm, “what’s acceptable here,” is the feedback mechanism that connects them.
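Here’s that system as a few lines of code. The visitor counts and probabilities are pure invention; what matters is that the perceived norm couples the stock to the flow.

```python
def garbage_after(initial_garbage, days=60, visitors_per_day=20):
    garbage = initial_garbage  # the stock
    for _ in range(days):
        # Perceived norm: the more garbage people see, the more
        # acceptable dumping feels (capped at a 30% littering rate).
        littering_probability = min(0.3, 0.02 * garbage)
        garbage += visitors_per_day * littering_probability  # the flow
    return garbage

print(f"spotless start: {garbage_after(0):7.1f} pieces after 60 days")
print(f"one piece left: {garbage_after(1):7.1f} pieces after 60 days")
```

A spotless riverbank stays spotless. Leave a single piece lying around and the loop takes over: the norm shifts, the inflow grows, the stock explodes.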
This bias is social learning in its purest form. We look to our environment for cues about how to behave. What do others do here? If everyone else follows the rules, I probably should too. If nobody does, why would I be the exception? We’re not making independent moral calculations. We’re reading the room. And acting accordingly. Look at the distribution pattern of cigarette butts: every so often you’ll find larger clusters, right where earlier butts signaled that this is an acceptable spot.
Speaking of cigarette butts: another epic fail of society, at least in Germany. It’s totally normal to flick cigarette butts onto the ground or out of the vehicle window. You rarely even get a weird look; it’s accepted behavior. Technically, it can be punished with a fine of up to 150€. But I have never heard of anyone actually being fined for littering with those pesky little bits of waste, which stick around for up to 10 years and leach toxins and microparticles into the ground and water.
Software testers know this pattern intimately. One “temporary” hack in the codebase. One skipped code review. One test that’s commented out “just for now.” The first piece of garbage. And suddenly the whole codebase feels like a place where standards don’t apply.
The Pattern Behind the Patterns
All three biases share something: reinforcing feedback loops. They’re not momentary lapses in judgment. They’re stable systems that resist change. And they all have a social component that makes them even stickier. We learn what to believe from our groups. We learn what commitment looks like from our culture. We learn how to behave from our environment. The social feedback loops reinforce the individual ones.
That’s why knowing about biases doesn’t make them disappear. You can’t think your way out of a feedback loop. You have to interrupt it.
So the next time you notice a bias in yourself or others, don’t ask “why are they being so irrational?” Ask instead: what system is keeping this alive? Find the loop. Find the leverage point. That’s where change becomes possible.
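To make “interrupt it” concrete: take the littering sketch from above and add one balancing flow, someone with a bag who removes a few pieces a day. Same invented numbers, one new parameter.

```python
def garbage_with_cleanup(days=60, visitors_per_day=20, cleanup_per_day=3):
    garbage = 1.0  # one careless act
    for _ in range(days):
        littering_probability = min(0.3, 0.02 * garbage)
        garbage += visitors_per_day * littering_probability
        garbage = max(0.0, garbage - cleanup_per_day)  # the interruption
    return garbage

print(f"with daily cleanup: {garbage_with_cleanup():.1f} pieces after 60 days")
```

A tiny, regular outflow does two jobs at once: it drains the stock and it resets the perceived norm, so the reinforcing loop never gets going. That’s what a leverage point looks like.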
PS: And before you ask whether I have read “Thinking, Fast and Slow” by Daniel Kahneman: yes, I have. Over 10 years ago. And the colleague I lent it to back then still has it. Kahneman probably has better explanations for all of this than I do.