Anish Patel

Staying Wrong

It’s okay to be wrong. It’s not okay to stay wrong.

The distinction matters. Being wrong is inevitable — uncertainty guarantees it. But staying wrong is a choice. It’s the refusal to update when evidence contradicts you, the habit of reinterpreting reality to protect your beliefs, the slow drift from “I might be wrong” to “the world is wrong.”


Why we stay wrong

Cognitive dissonance is the mechanism. When confronted with evidence that challenges our beliefs, we experience discomfort — and we’re wired to resolve that discomfort in the easiest possible way. Changing beliefs is hard. Reinterpreting evidence is easier.

So we explain away. “That was an edge case.” “The data wasn’t representative.” “External factors intervened.” Each individual explanation might be valid. But over time, a pattern emerges: every piece of contradicting evidence gets neutralised, while confirming evidence gets amplified. The belief never updates.

The problem isn’t stupidity. Smart people are often better at staying wrong — they’re better at generating sophisticated reasons why the contradicting evidence doesn’t count.


What aviation figured out

Aviation has a learning culture that few industries match. When a plane crashes, investigators recover the black box and reconstruct what happened. Findings are shared globally. Every airline learns from every failure.

The result is extraordinary: commercial aviation has become staggeringly safe, despite the complexity of the systems involved.

Most organisations work differently. Failures are ambiguous — “the project was delayed,” “we lost the deal,” “results came in below forecast.” Ambiguity allows reinterpretation. Each failure becomes a one-off, a special case, a victim of circumstances. The system never updates.


Red flags vs ambiguous setbacks

The difference is starkness. A plane crash is a red flag: you can't pretend the system worked. It forces honesty.

Most business failures are ambiguous setbacks. You can tell yourself a story that protects your model of the world. “We were right, but…” is the signature phrase of staying wrong.

This suggests a design principle: create red flags. Make failure stark enough that reinterpretation becomes difficult.

Track predictions against outcomes. Write down what you expect to happen, and when. Check later whether it did. The prediction log creates a record you can’t argue with.

Set tripwires. “If X drops below Y, we’ll reconsider the strategy.” The tripwire forces a reckoning — no drifting into rationalisation.

Force specificity. Vague predictions can’t be wrong. “Things will probably improve” survives almost any outcome. “We’ll hit 85 units by March” has no escape hatch.
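The first two ideas are concrete enough to sketch. Below is a minimal, illustrative version of a prediction log with a tripwire check, written in Python; the names (Prediction, tripwire) and the numbers are invented for the example, not a prescribed tool.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Prediction:
    claim: str                    # what you expect to happen, stated specifically
    due: date                     # when it becomes checkable
    expected: float               # the number you committed to
    actual: float | None = None   # filled in later, from reality

    def is_red_flag(self) -> bool:
        """A stark, unarguable miss: the outcome is in and it fell short."""
        return self.actual is not None and self.actual < self.expected


def tripwire(metric: float, threshold: float) -> bool:
    """If the metric drops below the threshold, force a reckoning now."""
    return metric < threshold


# Write the prediction down before acting; check it after the due date.
p = Prediction(claim="Hit 85 units by March", due=date(2026, 3, 31), expected=85)
p.actual = 72  # what actually happened

if p.is_red_flag() or tripwire(metric=p.actual, threshold=80):
    print("Red flag: reconsider the strategy rather than rationalise the miss.")
```

The point isn't the code, it's the commitment: the expectation, the date, and the threshold are written down before the outcome is known, so there is nothing left to reinterpret afterwards.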


The system and the culture

A good system isn’t enough. Even the best learning systems fail if people don’t feed them honest information. You need structures that capture failure data and cultures that reward honesty over ego protection.

This is harder than installing a new process. It requires leaders who model vulnerability — who say “I was wrong” publicly and visibly. It requires separating the learning question (“what happened and why?”) from the blame question (“whose fault was it?”). It requires treating failures as data rather than threats.

Most organisations claim to want this and don’t have it. The test is simple: when was the last time someone in your organisation was rewarded for admitting a significant mistake?


The personal discipline

The same principles apply to individual thinking.

Keep a prediction log. Not every prediction — that’s unsustainable. But the ones that matter: forecasts you’re acting on, beliefs you’d be uncomfortable updating, predictions where you have genuine confidence. Write them down, check them later.

Assign probabilities. “I’m 70% confident” forces you to admit uncertainty. It also creates calibration data — are your 70% predictions actually right 70% of the time? Most people are overconfident. The log shows you where.
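Checking calibration is mechanical once the log exists. A rough sketch, assuming each entry is just the stated confidence and whether the prediction came true; the entries here are made up for illustration.

```python
from collections import defaultdict

# Illustrative log entries: (confidence stated at the time, did it come true?)
log = [
    (0.7, True), (0.7, False), (0.7, True), (0.7, False),
    (0.9, True), (0.9, True), (0.9, False),
    (0.5, True), (0.5, False),
]

buckets: defaultdict[float, list[bool]] = defaultdict(list)
for confidence, came_true in log:
    buckets[confidence].append(came_true)

for confidence, outcomes in sorted(buckets.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    # Well calibrated: hit_rate sits near the stated confidence.
    # Overconfident: hit_rate falls well below it.
    print(f"claimed {confidence:.0%}: right {hit_rate:.0%} of the time "
          f"({len(outcomes)} predictions)")
```

If the 70% bucket comes out at 50%, that's the overconfidence made visible.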

Look for disconfirming evidence. When you believe something strongly, actively seek out the best arguments against it. Not strawmen — genuine, thoughtful opposition. If you can’t articulate why smart people disagree with you, you don’t understand the issue well enough.

Update in public. When you change your mind, say so — and say why. This is uncomfortable. It’s also the discipline that prevents staying wrong from becoming a lifestyle.


Being wrong is the price of operating under uncertainty. Staying wrong is a failure of integrity — the refusal to let reality update your map.

The question isn’t whether you’ll be wrong. It’s whether you’ll notice, and whether you’ll change.


Related: When Numbers Twitch · Hidden Priors

Connects to Library: Black Box Thinking · Process vs Outcome

#prediction