
You’re About to Make a Terrible Mistake!

by Olivier Sibony

In You're About to Make a Terrible Mistake!, Olivier Sibony reveals how cognitive biases distort high-stakes decision-making. Learn practical techniques to identify and counteract these biases, leading to more rational business strategies and improved outcomes.

Why Smart Leaders Make Bad Decisions

Why do intelligent, experienced leaders—those surrounded by smart teams and armed with sophisticated data—still make terrible strategic decisions? In You're About to Make a Terrible Mistake!, Olivier Sibony explores this mystery and reveals an uncomfortable truth: the problem isn’t intelligence, effort, or ethics—it’s that even great leaders are fundamentally human. They fall prey to cognitive biases, the predictable mental shortcuts that distort reasoning and lead to bad corporate choices.

Sibony, drawing on his decades as a McKinsey partner and on behavioral science research pioneered by Daniel Kahneman, argues that the key to better leadership decisions lies not in becoming more rational as individuals but in redesigning how organizations decide. He encourages leaders to see themselves not as supreme deciders but as decision architects—designers of the processes, teams, and conversations that produce smarter collective choices. The book’s core insight echoes Kahneman’s warning that we can’t eliminate our biases, but we can create systems that counteract them.

The Behavioral Foundations of Bad Decisions

At the heart of Sibony’s argument is behavioral economics’ sobering discovery: humans are predictably irrational. Under the influence of biases like overconfidence, confirmation bias, and groupthink, skilled professionals routinely make choices that contradict logic and evidence. These biases don’t stem from stupidity, Sibony explains, but from mental heuristics—energy-saving shortcuts that serve us in daily life but misfire in complex, uncertain contexts such as strategic decision-making.

A CEO may invest in a doomed merger not from greed, but from confirmation bias, eagerly searching for data that supports a deal she already believes in. A board might silence dissenting voices due to groupthink. An organization could cling to an outdated strategy out of status quo bias. What makes these errors dangerous, Sibony argues, is their predictability: they emerge systematically across different firms, industries, and leaders.

From Bias to Systematic Failure

The book opens with a parable familiar to consultants: a brilliant CEO ignores analysts’ warnings and makes a dubious acquisition, justifying the price by assuming future currency appreciation. Against the odds, the bet pays off. Years later, analysts label him a visionary. But Sibony reminds us that such successes often depend on luck, not superior reasoning. The same decision process—rooted in bias and overconfidence—could just as easily have led to catastrophe. In business, as in poker, good outcomes don’t always mean good decisions.

By examining classic corporate blunders—from J.C. Penney’s ill-fated reinvention to Polaroid’s digital downfall—Sibony identifies nine recurring “traps” in decision-making. These aren’t random; they arise from specific cognitive and social biases that push even well-intentioned organizations off course. The overarching theme: bad decisions are collective, patterned, and preventable—but only if organizations rethink how they decide.

Deciding How to Decide

If human judgment is flawed, what’s the alternative? Sibony rejects the naïve idea of eliminating biases through self-discipline or “thinking harder.” Instead, he advocates for organizational debiasing. The key isn’t to fix the individual mind—it’s to design environments and processes that constrain bias and enable better collective reasoning. Drawing parallels with fields like aviation and medicine, where checklists and cross-verification save lives, Sibony shows how collaboration and process can “de-bias” corporate strategy.

This concept culminates in the idea of decision architecture. Great leaders, Sibony writes, act as architects of decision systems, not merely heroic solitary deciders. They ensure open dialogue, invite dissent, design meetings that emphasize structured debate over presentation, and cultivate a culture where diverse viewpoints collide productively. In doing so, they replace charisma and instinct with rigor and humility.

Why These Ideas Matter

In an era where AI and algorithms promise supposedly “objective” decisions, Sibony reminds us of a deeper truth: human judgment remains essential but must be intelligently designed around its weaknesses. Whether you’re a CEO charting strategy, an entrepreneur facing a risky launch, or a policymaker making high-stakes calls, understanding and mitigating bias is a competitive advantage. As behavioral economist Richard Thaler (author of Nudge) argues, the key isn’t removing humanity from decisions but structuring environments so our humanity doesn’t get in the way. Sibony’s book provides a blueprint for doing just that—teaching you not only to decide better, but to decide how to decide.


The Storytelling Trap

We all love a good story—but stories, Sibony warns, are dangerous when they substitute for analysis. In business as in politics, narratives seduce us into false certainty. The first of the nine traps, the Storytelling Trap, shows how confirmation bias and our hunger for coherent tales lead leaders to embrace explanations that “feel” true, even when the facts don’t hold up.

When Belief Feels Like Truth

Sibony opens with two astonishing real-world scams: a 1970s French “oil-sniffing airplane” hoax and a near-identical Silicon Valley version thirty years later. In both cases, seasoned executives and investors poured millions into technology that supposedly detected oil from the sky—without drilling. Scientists, CEOs, and even governments fell for the con because the story delighted them: visionary inventors, national pride, and technological salvation. They didn’t want to check the numbers; they wanted to believe.

The same logic doomed J.C. Penney’s 2011 reinvention under Apple executive Ron Johnson. His narrative—make the aging retailer chic again through sleek design and premium brands—was irresistible. Yet he ignored evidence that Penney’s existing customers liked coupons, not couture. By telling a compelling story about transformation, Johnson and his board silenced doubt. The facts were true, but the story was false.

Why Stories Trump Facts

According to Sibony, confirmation bias lies at the core: once we decide what we want to believe, our minds automatically search for confirming evidence while ignoring contradictions. Worse, champion bias makes us trust persuasive messengers over messy data. A deal can sound smarter when pitched by a “visionary” CEO than when delivered by a cautious analyst. And experience bias reinforces this pattern—leaders trust their instincts from past successes, even when those experiences are irrelevant to the current challenge.

Even rigorous disciplines fall prey. Sibony cites cognitive scientist Itiel Dror’s studies showing that professional fingerprint examiners changed their conclusions when told whether a suspect had confessed. Facts remain constant, but stories around them rewire judgment—a lesson every leader should heed before interpreting market data or forecasts.

Escaping the Story

To resist the Storytelling Trap, Sibony suggests cultivating story-checking rather than fact-checking alone. Ask: “What’s another story that could explain these same facts?” Encourage teams to generate competing narratives—not to choose one instantly, but to make bias visible. As Nassim Taleb noted in The Black Swan, the mind is an “explanation machine” that craves order over truth. Recognizing this allows you to spot when a neat story has replaced a messy but accurate reality. In short, a great leader listens for stories but makes decisions on structure and evidence, not seduction.


The Imitation Trap

If storytelling deludes us internally, idolizing others blinds us externally. In “The Imitation Trap,” Sibony explores our obsession with business heroes—Steve Jobs, Jack Welch, Elon Musk—and how the halo surrounding their success misguides imitators. The result: companies mimic strategies that worked somewhere else without realizing those strategies succeeded due to context, not genius.

Three Cognitive Errors of Hero Worship

Attribution error makes us credit one individual—say, Jobs—for collective achievements that actually rely on whole organizations. The halo effect leads us to assume that successful leaders’ habits (their attire, routines, slogans) caused their success, rather than merely accompanying it. Finally, survivorship bias tricks us into studying only winners while ignoring the countless similar failures we never see.

Consider Ron Johnson again. His reputation from Apple—a paragon of design—made him appear infallible. But Apple’s success owed as much to irresistible products (iPhones, iPads) as to store design. Transplanted to an ailing mid-range retailer, his methods proved disastrous. Sibony notes that imitation eliminates differentiation: copying rivals’ “best practices” produces sameness, not strategy. (As Michael Porter argued in “What Is Strategy?”, operational effectiveness is not the same as strategic advantage.)

Beyond “Best Practices”

Sibony encourages you to analyze why something works before copying it. Many firms imitated GE’s ruthless “forced ranking” system because it appeared to create excellence—until they discovered it destroyed morale. Even GE later abandoned it. The lesson: instead of cloning “great leaders,” study decisions contextually. Ask which factors truly explain success—and which ones are noise.

Learning from Failures

Sibony ends with a subtle reversal: instead of worshiping success, study failure. “Everyone agrees we learn from our mistakes,” he writes, “but we still idolize winners.” Great decision-makers, like disciplined scientists, seek disconfirming evidence, not role models. The discipline to avoid easy imitation is, ironically, the first real step toward originality.


The Overconfidence Trap

Of all biases, overconfidence may be the most pervasive. We overestimate what we know, underestimate uncertainty, and exaggerate control over outcomes. In “The Overconfidence Trap,” Sibony dissects how self-assurance—so prized in leaders—can doom decisions. Through stories like Blockbuster’s fatal arrogance toward Netflix, he demonstrates that optimism drives both innovation and ruin.

Three Faces of Overconfidence

Sibony distinguishes between overplacement (believing we’re better than others), overprecision (believing our forecasts are more accurate than they are), and optimistic planning (underestimating costs and timelines). These distortions create what Kahneman and Tversky called the “planning fallacy.” From the Sydney Opera House to the F-35 fighter jet, massive overruns illustrate how optimism masquerades as ambition.

Timid Choices, Bold Forecasts

Sibony introduces an irony: corporate leaders often make “timid choices and bold forecasts.” They shy away from small risks yet green-light massive, over-optimistic projects. This contradiction stems from organizational pressure to look decisive. Admitting uncertainty sounds weak, while precise numbers project mastery. But as Sibony quips, “No spreadsheet ever called a meeting.” Swapping realism for confidence aids careers—but not decisions.

Optimism as Evolutionary Fuel

Intriguingly, Sibony doesn’t vilify optimism. Evolution, he notes, selected for hope: risk-takers succeed more spectacularly than cautious peers (and survive in leadership positions). Yet optimism is useful only when tied to what you can control. Be ambitious about execution, he urges, but humble about prediction. The best CEOs, like Warren Buffett, distinguish between what’s controllable and what’s not—and bet only on the former.


The Inertia Trap

Why do companies fail to change even when danger is obvious? In “The Inertia Trap,” Sibony shows that organizational paralysis—anchoring on past budgets, fear of losses, and status quo comfort—prevents action. Using Polaroid’s slow collapse and the resource allocation rituals of big corporations, he reveals how bias, not bureaucracy, often chains progress.

Anchoring and Anchored Organizations

Year after year, firms spend 90% of budgets exactly as before—a pattern confirmed by McKinsey data Sibony helped generate. Executives claim to rethink allocations from scratch, but anchoring bias glues them to last year’s numbers. Even judges, experimental studies show, can be influenced by random dice rolls when sentencing criminals. If absurd anchors sway experts, how can managers escape their own historical baselines?

Escalation of Commitment

When decisions go wrong, inertia turns lethal. The sunk-cost fallacy makes leaders double down on failure—continuing wars, hemorrhaging investments, doomed ventures. Sibony cites GM’s $20 billion Saturn experiment, which lost money every year for nearly three decades. Leaders couldn’t abandon it without “admitting failure.” The longer a project lasts, the harder it becomes to quit.

The Status Quo Bias

Even absent prior investment, Sibony argues, we prefer inaction. Experiments show individuals irrationally cling to default choices—whether in pension plans or asset portfolios. In firms, this manifests as “strategic drift”: existing businesses get automatic renewal, while new ones fight uphill to justify funding. Breaking inertia, he concludes, requires deliberate countermeasures—like routine portfolio reviews or “what if we had to start over?” questions that change the default from staying put to acting anew.


Collaboration and Process: The True Decision Advantage

After diagnosing these traps, Sibony turns to the cure. The most striking data in the book comes from a study of over 1,000 corporate investment decisions: when analysts controlled for industry and company differences, they found that how decisions were made explained 53% of success variance, while what decisions were made—the analytical content—explained only 8%. In other words, process matters six times more than ideas.

From Analysis to Architecture

Executives spend endless hours refining spreadsheets and forecasts (“the what”) but neglect discussion design (“the how”). Sibony argues that high-quality collaboration—diverse voices, explicit debate over risks, predefined criteria—produces better outcomes even when knowledge is imperfect. The findings upend corporate tradition: an hour spent improving process beats a week tweaking models.

Decision Architects, Not Geniuses

The ultimate job of leadership, Sibony proposes, is architecture. A “Decision Architect” organizes who speaks, how dissent is encouraged, and when to challenge assumptions. Process doesn’t mean bureaucracy—it means intentional design. As NASA and cockpit procedures show, structured collaboration averts disaster without stifling authority. CEOs who master this craft build what Kahneman calls “noise-reducing systems”—frameworks that make organizations wiser than any one person.

The takeaway: analyze less, discuss more. Your company may already know enough to choose well—but until you redesign how voices are heard and conflicts managed, even the best data will mislead you.


Dialogue, Divergence, and Decision Architecture

In the final section, Sibony transforms behavioral theory into practice through three pillars of Decision Architecture: Dialogue (how perspectives meet), Divergence (how alternatives emerge), and Dynamics (how culture sustains both). Each pillar reverses common corporate habits—replacing presentation with conversation, conformity with experimentation, and rigidity with agility.

1. Dialogue: Creating Constructive Conflict

Most meetings reward agreement. In Sibony’s model, real dialogue requires tension: structured dissent, devil’s advocates, or “Six Amigos” committees that review decisions from fresh perspectives. Limiting PowerPoint, pre-reading memos, and scheduling explicit “for discussion” vs. “for decision” sessions transform politeness into productive contradiction. As Eric Schmidt once said of Google meetings, “discord plus deadline” makes debate sharp but results decisive.

2. Divergence: Seeing Things Differently

To escape groupthink, Sibony promotes “planned disagreement.” Use red teams, external challengers, or the “premortem”—imagining failure in advance to surface unseen risks. Gather data from outsiders, even if they disrupt comfort zones. He advises organizations to “fight bias with bias”: counter anchoring by re-anchoring from scratch, fight confirmation bias with multiple analogies, or reverse status quo bias by making reexamination the default.

3. Dynamics: Institutionalizing Better Decisions

Even the best techniques die without cultural alignment. Agility arises when leaders encourage experimentation, accept small failures, and share big lessons. A powerful principle is granting the right to fail (but not the right to be careless)—turning postmortems into learning, not blame. Great leaders, Sibony concludes, model humility: they admit mistakes, change their minds when facts change (echoing Keynes), and ‘sleep on it’ before making final calls.

Decision architecture, then, is less about control than about curiosity. By designing teams that argue well and adapt quickly, you can institutionalize wisdom in a world where certainty is impossible. It’s a shift from heroic leadership to collective intelligence—a transformation that turns terrible mistakes into teachable moments.
