
The Undoing Project

by Michael Lewis

The Undoing Project delves into the groundbreaking partnership of Daniel Kahneman and Amos Tversky. Their work on decision-making and cognitive biases transformed our understanding of human behavior, revealing that our choices are often swayed by unseen factors.

How We Misjudge, Predict, and Decide

Why do smart people make predictable mistakes? The Undoing Project explores this question through the intertwined stories of Daniel Kahneman and Amos Tversky—two Israeli psychologists whose collaboration uncovered the system behind human error. Through their work, you learn how your mind replaces hard math with intuitive shortcuts, how stories overpower statistics, and how doubt and rigor can rescue judgment from its natural biases. It’s also a book about how friendship and intellect can shape entire fields—from psychology to economics, from basketball strategy to public policy.

The human mind as a prediction machine

At its core, this book argues that knowledge equals prediction: the better you can forecast outcomes, the more you actually understand. Daryl Morey, an NBA executive inspired by Bill James and data analytics, embodies this idea in basketball. He built models to forecast player performance, treated roster building like a scientific experiment, and constantly tested hypotheses on the court. His journey parallels Kahneman and Tversky’s psychological revolution: turning intuition into data, judgment into prediction, and error into insight.

Kahneman’s doubt and Tversky’s features of thought

Kahneman’s childhood in wartime Europe taught him to distrust easy answers. That skepticism shaped his later method: measurable, structured, humble before evidence. In contrast, Tversky was bold and theoretically brilliant—he saw patterns in human judgment the way a physicist sees symmetry. His work on feature-based similarity revealed that how you frame comparisons determines what you find similar or important. Together, their temperaments—Danny’s doubt and Amos’s confidence—became a perfect engine for exploring how minds work when the world is uncertain.

From intuition to experiment

Kahneman and Tversky did something rare: they turned everyday thinking into measurable research. Instead of studying exotic disorders, they examined normal reasoning—why gamblers see streaks in random events, why doctors misdiagnose the obvious, why people bet against probability. Their lab wasn’t full of rats and levers; it was built from short surveys, coin flips, and verbal puzzles. These “quick-fire” experiments revealed the hidden algorithms in your intuition—the heuristics that make your life easier but your reasoning fragile.

The discovery of heuristics and biases

Through systematic testing, they uncovered how people substitute rough rules for rational calculation. When you guess likelihoods, you use heuristics: representativeness (judging by resemblance), availability (judging by what comes to mind), and anchoring (starting from an arbitrary number). These shortcuts explain classic fallacies—like why you think a well-described person is probably a computer scientist even if the base rate is tiny, or why you fear flying after seeing a plane crash on the news. The mind confuses vividness for truth and similarity for probability.

From the lab to the world

Their insights radiated outward: into economics, where Richard Thaler used them to build behavioral economics and explain market quirks; into medicine, where doctors redesigned decision protocols; and into corporate and government policy, where Cass Sunstein and others turned behavioral science into choice architecture. The same biases that blind doctors or investors also shape voters and pilots. The remedy, Kahneman and Tversky showed, is not to train perfect rationalists but to design systems where bias matters less.

Why this story matters

The book’s emotional core is not just the science but the partnership. Danny and Amos loved arguing, laughing, and stripping illusions bare. When they quarreled, their collaboration collapsed, and with it, a golden age of ideas. Yet their discoveries endured because they captured something timeless: how minds make sense of chaos. From predicting basketball talent to saving medical patients to crafting policy, their work changed how you can think, decide, and design your environment to reason better. (Note: The book’s title, The Undoing Project, comes from Kahneman’s later study of counterfactual thinking—the mental habit of imagining what might have been, another example of the mind’s flawed yet revealing patterns.)

Core message

Human judgment can be disciplined but never perfect. If you build structures—models, checklists, and data systems—that constrain your biases and update with evidence, you transform intuition into knowledge.

In short, The Undoing Project is about how science, friendship, and error converge to reveal what it means to think. It teaches you that your brain’s shortcuts are not failings but clues—and that understanding them is the first step to making better predictions and wiser decisions.


Building Better Judgment

If you run a company, manage people, or simply make high-stakes choices, Daryl Morey’s story offers a tangible blueprint for what Kahneman and Tversky theorized. Morey runs basketball like a forecasting lab: he builds models, measures relentlessly, and treats every draft pick as an experiment. His philosophy—“knowledge is prediction”—translates cognitive science into practice.

Model first, judgment second

Morey began by questioning what data mattered. Instead of relying on raw stats like points per game, he built per-minute and context-adjusted metrics—rebounds per opportunity, pace-adjusted shooting, and measurements like wingspan or first-step quickness. These adjustments made signals cleaner. When the model picked undervalued players (like Carl Landry or Aaron Brooks), it proved its worth; but when it overvalued others (like Joey Dorsey), it revealed exactly which variables needed reweighting. Predictive power became both scorecard and teacher.
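The book does not give Morey's actual formulas, but the kind of adjustment described above can be sketched with invented numbers: normalize a counting stat to a per-36-minute rate, then correct for team pace so fast-paced teams don't inflate it. All figures and function names here are hypothetical illustrations, not the Rockets' real model.

```python
def per_minute(stat_total, minutes):
    """Normalize a counting stat to a per-36-minute rate."""
    return stat_total / minutes * 36

def pace_adjusted(rate, team_possessions_per_48, league_avg_possessions=96.0):
    """Scale a rate to league-average pace so fast teams don't inflate it."""
    return rate * league_avg_possessions / team_possessions_per_48

# Player A: big raw totals, heavy minutes, fast-paced team.
a = per_minute(stat_total=18, minutes=36)
a_adj = pace_adjusted(a, team_possessions_per_48=104)

# Player B: modest totals, but produced in far fewer minutes on a slow team.
b = per_minute(stat_total=12, minutes=20)
b_adj = pace_adjusted(b, team_possessions_per_48=92)

print(round(a_adj, 1), round(b_adj, 1))  # B looks better once context is removed
```

The point is the one the chapter makes: raw totals reward opportunity, while rate and context adjustments reward production.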

Why experts go wrong

Scouts, like doctors or investors, fall prey to biases: confirmation bias, the halo effect, and the endowment effect. Confirmation bias makes you seek evidence that flatters your first impression; halo effects turn one visible feature—height, charm, beauty—into a global judgment; and the endowment effect makes you overvalue what you already own. Morey built guardrails: banning nicknames, forcing scouts to write valuations before discussions, and weighting model data more heavily than human impressions. These tweaks transform intuition from master to servant.

Behavioral countermeasures

To offset bias, you don’t need genius—you need structure. Morey made scouts justify claims numerically, compared models and humans on historical predictions, and iteratively retrained both. The principle extends anywhere: use checklists in hiring; document decisions before negotiation to avoid the endowment trap; collect missing data instead of relying on gut feel. Over time, the model–human system becomes a dialogue that improves itself.

From sports to science

Morey’s franchise became an applied psychology lab, echoing Kahneman’s Israeli army experiment that replaced unstructured interviews with scored ratings. Both stories show that disciplined structure—not brilliance—yields more accurate decisions. Whether you’re evaluating basketball players, soldiers, or job candidates, the path forward is the same: define key variables, collect consistent data, and let evidence constrain your biases. The lesson is humbling but liberating: rationality is an organizational achievement, not an individual trait.

Practical rule

Start with a simple model; add human judgment only to fill data gaps. Then test, measure, and update. Knowledge lives where prediction improves.

Morey, like Kahneman and Tversky, shows that every prediction discipline—whether sports analytics, investing, or hiring—relies on the same human truth: intuition must be checked by evidence, and evidence must be organized to defeat bias.


The Science of Bias

Daniel Kahneman and Amos Tversky built a science around predictable error. They discovered that ordinary people, including experts, systematically deviate from logical or statistical reasoning in ways that follow consistent patterns. This recognition—that bias has structure—turned psychology into a tool for decision-making and reshaped disciplines from medicine to economics.

Representativeness

When you estimate probability, you often substitute resemblance for reality. A kind, bookish person “seems like” a librarian, so you assume that’s likely—ignoring that librarians are rare. That’s representativeness. In the NBA, a player who looks like a star may get drafted sooner than one whose data predict future success. This heuristic is quick but error-prone because it discards base rates.
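The librarian example comes down to Bayes' rule: resemblance (the likelihood) has to be multiplied by the base rate, which is exactly the term representativeness drops. A minimal sketch, with invented base rates and likelihoods:

```python
# Invented numbers for illustration: even if a librarian is far more likely
# than a non-librarian to fit the "kind, bookish" description, librarians can
# still be the unlikely answer because they are rare (Bayes' rule).

def posterior(prior_a, prior_b, like_a, like_b):
    """P(A | description) when A and B are the only two hypotheses."""
    num = prior_a * like_a
    return num / (num + prior_b * like_b)

p_librarian = posterior(
    prior_a=0.002, prior_b=0.998,   # base rates: librarians are rare
    like_a=0.90, like_b=0.05,       # P(fits description | occupation)
)
print(round(p_librarian, 3))  # ≈ 0.035: the "obvious" librarian is improbable
```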

Availability

You judge frequency by what comes easily to mind. After hearing about shark attacks, you overestimate their likelihood. Kahneman and Tversky demonstrated this with simple word tests—people judge words beginning with "K" to be more common than words with "K" as the third letter, because first-letter examples come to mind faster, even though third-letter K words are actually more frequent. What is vivid feels true. Ease of retrieval masquerades as evidence of frequency.

Anchoring

Even arbitrary numbers bias you. In one experiment, a spin of a rigged wheel of fortune shifted subjects' estimates of the percentage of African countries in the United Nations—the number acted as an unconscious anchor. Whether pricing goods, predicting revenue, or setting salaries, initial figures stick stubbornly. You adjust away from them too little, creating systematic distortions.
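Anchoring-and-adjustment is often described as starting from the anchor and moving toward one's true belief, but stopping short. A toy model of that description follows; the 40% adjustment rate is an invented illustration, not a published estimate.

```python
# Toy anchoring model (invented adjustment rate): the estimate lands partway
# between the arbitrary anchor and the estimator's true belief.

def anchored_estimate(anchor, true_belief, adjustment=0.4):
    """Adjust from the anchor toward the true belief, but insufficiently."""
    return anchor + adjustment * (true_belief - anchor)

print(anchored_estimate(anchor=10, true_belief=25))  # low wheel -> low guess
print(anchored_estimate(anchor=65, true_belief=25))  # high wheel -> high guess
```

The model reproduces the qualitative finding of the wheel experiment: the same underlying belief yields systematically different answers depending on the meaningless starting number.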

From small numbers to big errors

People also believe in a "law of small numbers"—they expect small samples to mirror the whole population. Investors treat a short winning streak as proof of talent. Doctors treat a handful of cases as reliable evidence. Kahneman and Tversky showed that even trained statisticians fall for this tendency, highlighting how randomness defies the mind's intuitive search for patterns.

What this means for you

Your mind is a story engine. It prefers coherence to probability. Learning to spot the footprints of heuristics—where judgment feels easy—is the first defense against predictable error.

Through these insights, Kahneman and Tversky didn’t make human reasoning look hopeless—they made it measurable and, therefore, improvable. By seeing bias as pattern, you gain the power to design habits and systems that correct it.


Risk, Framing, and Prospect Theory

Traditional economics assumed that you, like a computer, weigh outcomes by their expected value. Prospect Theory shattered that illusion by showing that your mind evaluates gains and losses relative to a reference point—and that losses hurt more than equivalent gains please. This insight replaced the tidy math of utility theory with the messy psychology of human feeling.

Loss aversion and reference points

Kahneman and Tversky found that you’re loss-averse: losing $100 feels about twice as painful as gaining $100 feels good. You evaluate outcomes relative to what you expect or believe you deserve, not absolute wealth. That’s why a $500,000 bonus disappoints someone expecting a million—it sits in the mental realm of “loss.”
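Loss aversion and diminishing sensitivity are usually written as a value function over gains and losses relative to the reference point. A minimal sketch, using parameter values commonly cited from Tversky and Kahneman's later estimates (alpha ≈ 0.88, lambda ≈ 2.25):

```python
# Prospect-theory value function: v(x) = x^a for gains, -l*(-x)^a for losses.
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAM = 2.25     # loss aversion: losses loom roughly twice as large as gains

def value(x):
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAM * ((-x) ** ALPHA)

print(round(value(100), 1), round(value(-100), 1))
print(value(100) + value(-100) < 0)  # a 50/50 gamble on +/-$100 feels net negative
```

This is why most people refuse an even-odds bet on $100: the curve makes the possible loss outweigh the equal-sized gain.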

Probability weighting

You overweight tiny probabilities and underweight moderate and large ones. That's why you buy both lottery tickets and insurance. The math doesn't differ; your mind distorts the perception of risk. In daily life, that means you overprepare for rare disasters but ignore common drags like slow career decay.
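This distortion has a standard functional form in prospect theory. A sketch using one published estimate of the curvature parameter (gamma ≈ 0.61 for gains); the exact value varies by study:

```python
# Probability-weighting function: w(p) = p^g / (p^g + (1-p)^g)^(1/g).
G = 0.61

def w(p):
    """Decision weight: overweights small p, underweights moderate-to-large p."""
    return p ** G / (p ** G + (1 - p) ** G) ** (1 / G)

for p in (0.01, 0.50, 0.99):
    print(p, round(w(p), 3))
# A 1% chance is felt as several times more than 1%; a 99% chance as
# meaningfully less than 99% -- hence both lottery tickets and insurance.
```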

Framing and the isolation effect

The way information is framed determines your choice. In their "Asian Disease Problem," the identical outcomes are described either as lives saved or as lives lost, and preferences flip on wording alone. When options are framed as gains, people prefer certainty; framed as losses, they gamble to avoid a sure defeat. The conclusion is profound: you don't choose between things—you choose between descriptions of things.

Implication

To influence decisions—your own or others’—redesign reference points and frames, not just incentives. People follow how choices feel, not how they calculate.

Prospect Theory explains why markets swing, why voters react to losses more than progress, and why personal happiness depends on expectations. It replaces the ideal of rational man with a more accurate, emotional model of human behavior.


The Power of Story

Humans crave narrative coherence, even when it violates logic. The Linda experiment—asking whether a bright, feminist-leaning woman is more likely to be (1) a bank teller, or (2) a bank teller active in the feminist movement—revealed how stories overpower mathematics. Most people pick the conjunction, even though a conjunction can never be more probable than either of its parts. The story feels true; the math feels cold.
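The conjunction rule itself can be verified by brute force: anyone who is both a teller and a feminist is, by definition, also a teller, so the joint event can never be more frequent. A simulation with invented trait rates:

```python
import random

# In any population, people with BOTH traits are a subset of people with one
# trait, so P(teller AND feminist) <= P(teller). Trait rates are invented.
random.seed(1)

population = [
    {"teller": random.random() < 0.02, "feminist": random.random() < 0.30}
    for _ in range(100_000)
]

p_teller = sum(p["teller"] for p in population) / len(population)
p_both = sum(p["teller"] and p["feminist"] for p in population) / len(population)

print(p_both <= p_teller)  # True, always: the conjunction cannot win
```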

Why your mind loves coherence

You favor explanations that fit your stereotypes and emotional expectations. Stories with rich details “make sense,” while bare facts feel incomplete. This narrative instinct aids understanding but kills accuracy. In medicine, law, and investing, vivid anecdotes routinely outweigh base rates.

How to resist it

The antidote is to translate stories into counts: ask “how many out of 100 cases?” This reframing restores statistical reasoning. As researchers like Gerd Gigerenzer showed, frequency formats help people overcome conjunction errors. The moral: shift your representation, not your personality, to think clearly.
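The frequency-format trick can be made mechanical: restate each probability as a count of concrete cases out of the same whole. A small sketch (the probabilities are invented for illustration):

```python
# Restating probabilities as counts makes the subset relation visible,
# which is why frequency formats reduce conjunction errors.

def as_counts(p, out_of=100):
    """Restate a probability as 'about N out of M cases'."""
    return f"about {round(p * out_of)} out of {out_of} cases"

p_teller = 0.02            # invented: P(bank teller)
p_feminist_given = 0.60    # invented: P(feminist | bank teller)

print("Bank tellers:", as_counts(p_teller))
print("Feminist bank tellers:", as_counts(p_teller * p_feminist_given))
# "2 out of 100" vs. "1 out of 100": as counts, the subset can't be larger.
```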

Short rule

If a story fits too perfectly, pause. Coherence is not proof. Separate how well it reads from how likely it is.

The conjunction fallacy and similar narrative biases teach a crucial lesson: emotional plausibility and logical probability are often at odds. Recognizing that dissonance is how you protect yourself from persuasive but false coherence.


Undoing and Remembering

As Kahneman explored regret and memory, he realized that your mind not only predicts—it rewrites. The simulation or undoing heuristic explains why you torment yourself with “if only” and why near misses haunt you. You can’t help generating counterfactuals, but they follow strict psychological rules.

Rules of undoing

You change what's easiest to imagine: rare events, recent deviations, or individual actions. This is why a banker who dies on a detour triggers more "if-only" thinking than one who dies on his usual route. You fix the scene and alter the actor, following patterns that minimize change. Kahneman called these mental preferences "downhill" and "focus" rules.

Regret, memory, and the peak–end rule

Your memories don’t record the flow of time; they compress it to moments that stand out—the peak and the end. Kahneman and Don Redelmeier’s experiments with painful medical procedures proved that longer but slowly improving experiences feel better in retrospect than shorter, abruptly ending ones. You remember trajectories and endings, not totals.
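The peak–end rule is often approximated as the average of the worst moment and the final moment of an experience. A sketch with invented pain traces that echo the structure of the Redelmeier experiments:

```python
# Peak-end approximation: remembered pain tracks the average of the worst
# moment and the last moment, not the total. Pain traces (0-10) are invented.

def remembered(pain):
    """Peak-end approximation of a retrospective rating."""
    return (max(pain) + pain[-1]) / 2

short = [2, 4, 8, 8]           # ends abruptly at its worst moment
long_ = [2, 4, 8, 8, 5, 3, 1]  # same start, plus a gently improving tail

print(sum(short), remembered(short))  # less total pain, remembered as worse
print(sum(long_), remembered(long_))  # more total pain, remembered as milder
```

The extended procedure contains strictly more pain, yet the model, like the patients, rates it as the better memory.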

Emotional reasoning

Undoing and remembering reveal decision biases rooted in feeling, not logic. They explain why you obsess over near successes and misattribute causes to emotion-laden moments. Understanding these patterns doesn’t erase regret—it prevents you from drawing false lessons from it.

Practical reflection

Your memory is not a ledger but a storyteller. If you redesign experiences to end better, you change how they’re remembered—and thus, how you decide next time.

In showing how you mentally rewrite the past, Kahneman added an emotional dimension to his earlier cognitive work: you predict not just outcomes, but feelings. That understanding deepens both self-awareness and empathy.


Designing Rational Systems

The final chapters of the book demonstrate how behavioral insights become systems. Whether it’s Don Redelmeier saving lives by questioning diagnostic stories, Cass Sunstein redesigning government defaults, or Daryl Morey refining models, the lesson is consistent: you cannot change human nature, but you can change the environment in which it operates.

From psychology to policy

Richard Thaler converted prospect theory into behavioral economics, exposing how loss aversion and the endowment effect shape markets. Cass Sunstein translated these findings into “choice architecture”—the design of defaults that guide behavior. Automatic benefits enrollment, organ donation opt-outs, and simple health decision forms all trace to Kahneman and Tversky’s insights about human inertia and framing.

System design beats self-control

Amos Tversky liked to say you can’t de-bias a mind, but you can de-bias a context. Delta Air Lines reduced flight errors by flattening cockpit hierarchies so junior pilots could question captains—an applied form of bias-proof design. Medicine found similar gains: structured checklists, second opinions, and diagnostic pauses improved outcomes more than “try harder” slogans ever did.

The enduring impact

Through behavioral policy, psychology now informs economics, public health, and business strategy. The line runs directly from a conversation in a Jerusalem cafeteria to modern “nudge” units in governments. The genius was not discovering irrationality but learning how to build around it.

Systemic moral

Fix environments, not minds. Structure decisions so doing the rational thing is easier than doing the biased one.

That idea—engineering settings for predictably imperfect people—is the lasting gift of Kahneman and Tversky’s work. It moves psychology from diagnosis to design, from understanding error to preventing it.
