Why We Make Mistakes

by Joseph T. Hallinan

Why We Make Mistakes explores the fascinating reasons behind human error, from cognitive biases to perceptual limitations. Offering a blend of psychology, neuroscience, and practical advice, Joseph T. Hallinan reveals how understanding these errors can help us improve decision-making and avoid common pitfalls.

Why We Make Mistakes and What That Reveals About Us

When was the last time you made a mistake—maybe misplaced your car keys, forgot a password, or made a foolish purchase you instantly regretted? In Why We Make Mistakes, journalist and Pulitzer Prize winner Joseph T. Hallinan argues that such blunders aren’t anomalies to be ashamed of—they’re essential clues to how the human brain works. His central claim is simple but profound: we make mistakes not because we are careless or lazy, but because our minds evolved to interpret the world in shortcuts and patterns that worked brilliantly in some contexts—and fail spectacularly in others.

Hallinan draws from decades of cognitive psychology, neuroscience, and real-world stories—from airline disasters and medical errors to supermarket pricing tricks—to make the case that error is not the exception but the rule of human life. Understanding why we err, he contends, can make us humbler thinkers and smarter decision-makers.

The Pattern-Seeking Brain

At the heart of Hallinan’s thesis is the idea that our brains don’t passively record the world—they actively interpret it. We look for patterns and meaning, even when none exist. This tendency to “connect the dots” helps us make sense of complexity but also leads us astray. When we see a person behaving angrily, we instantly invent a story about their personality, ignoring situational factors. When we see trends in stock prices or sports results, we imagine we’re spotting patterns instead of randomness. Our bias toward meaning-making is one of humanity’s greatest strengths—and its biggest weakness.

Seeing Without Seeing

One of the opening chapters reveals a stunning truth about perception: we see far less than we think we do. Our eyes have clear focus across only about two degrees of our visual field—roughly the width of your thumb at arm’s length—yet we’re convinced we perceive everything vividly. Hallinan recounts experiments by psychologists like Daniel Simons, where people watching a video of basketball players entirely miss a man in a gorilla suit walking through the scene. Even professionals like radiologists and airport security screeners suffer similar “inattentional blindness.” This illusion of completeness in our vision becomes a metaphor for how we overestimate our knowledge across all domains.
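The thumb-width comparison can be checked with simple trigonometry; assuming an arm’s length of about 60 cm (my figure, not Hallinan’s):

```latex
% Width w subtended by a 2-degree visual angle at viewing distance d:
w = 2d\tan(1^\circ) \approx 2 \times 60\,\mathrm{cm} \times 0.0175 \approx 2.1\,\mathrm{cm}
```

That is indeed about the width of an adult thumb, so the everyday comparison holds up.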

Memory, Meaning, and Misremembering

Humans also remember meaning, not details. Hallinan describes psychologist Harry Bahrick’s long-term memory studies showing that former students could recognize high school classmates’ faces after fifty years—but forgot most of their names. Similarly, we can visualize a penny but have trouble recalling the placement of its inscriptions because meaning, not precision, is what sticks. We misremember events and polish them until they fit our self-image. People inflate their high school grades, parents recall being more attentive than they were, and eyewitnesses confidently identify the wrong person—all examples of memory as reconstruction, not retrieval.

The Brain’s Built-In Biases

Hallinan shows that biases are embedded in how we interpret, focus, and decide. We are overconfident (“I can’t forget that password” or “I’ll use this gym membership”), we favor what confirms our expectations, and we mistake correlation for causation. One striking example is the story of anesthesiologists who drastically reduced fatal mistakes—not by blaming individuals but by redesigning their systems: standardizing equipment, using checklists, and flattening hierarchy so nurses could challenge doctors. Their approach, Hallinan suggests, shows that the key to reducing error is not perfection of people but redesign of environments.

The World Isn’t Built for Our Minds

The book goes further: modern life constantly pits our Stone Age brains against systems that assume we remember passwords, multitask efficiently, and absorb complex instructions. Yet humans can only retain about five unrelated items in short-term memory. That’s why pilots forget landing checks and drivers crash while fiddling with GPS devices. When things go wrong, we wrongly blame “human error” rather than poor design—just as Captain Loft did before crashing Eastern Airlines Flight 401 while distracted by a $12 lightbulb.

Learning to See Our Mistakes

Ultimately, Hallinan’s book is about awareness. By understanding how illusions, biases, and limits shape our perception and memory, we can adjust environments to fit our humanity rather than the other way around. Mistakes, he argues, aren’t moral failings; they are the price of being human. Learning from them—like doctors who adopted aviation-style safety checklists or investors who keep journals of their decisions—can help us make fewer or at least more intelligent mistakes.

If Daniel Kahneman’s Thinking, Fast and Slow explains the mechanics of cognitive bias, Hallinan’s Why We Make Mistakes makes that science personal—grounded in vivid stories, quirky research, and relatable confessions. It’s a compassionate reminder that our errors, far from undermining intelligence, reveal the contours of how our astonishingly adaptive—but imperfect—minds really work.


We Look But Don’t Always See

Have you ever searched frantically for an object sitting right in front of you? Hallinan begins with this everyday blindness. In one memorable story, a young Burt Reynolds in a bar punches a man during an argument, only to realize mid-swing that his opponent has no legs. He was looking—but not seeing. Reynolds’s mistake illustrates what psychologists call a “looked-but-didn’t-see” error: we can stare directly at something and fail to register it because attention and vision don’t always line up.

Selective Vision and Change Blindness

Hallinan describes cognitive experiments by Daniel Simons and Daniel Levin at Cornell University. When pedestrians gave directions to a stranger who was covertly swapped out mid-conversation as two people carried a door between them, most never noticed the change. Even when the new person was a different height or wore different clothing, half the participants continued speaking as if nothing had happened. We see less of the world than we think, and our brains fill in gaps with assumptions.

The Expert’s Quiet Eye

Hallinan contrasts experts with novices. A great golfer, for example, doesn’t glance around while putting; instead, she keeps a “quiet eye”—a still, sustained focus just before motion, allowing precise motor programming. Less skilled players flick their gaze from ball to club and back, fragmenting attention. This insight appears across domains: skilled surgeons, marksmen, and musicians exhibit longer, calmer gazes right before action. True excellence, Hallinan suggests, requires disciplined attention rather than frantic multitasking.

Seeing What We Expect

Humans see with their minds as much as with their eyes. We recognize people and objects by categories, not details. In Simons's study, people noticed changes only when the stranger was similar to them in age or social group—our perception sharpens for faces like our own and blurs for “others.” This filtering explains why eyewitnesses are less accurate when identifying people of another race and why categories like “construction worker” or “doctor” override individual traits. As Hallinan says, “We trade visual details for meaning—and we don’t even know it.”

The Cost of Not Seeing

Our perceptual biases aren’t just amusing—they can be deadly. Airport baggage screeners and radiologists miss up to 30% of dangerous items or tumors because they see them so rarely. When the target is rare (like a bomb or cancer), our brains lower the threshold for quitting the search—a phenomenon Hallinan calls the “quitting threshold.” As Harvard’s Jeremy Wolfe told him, “If you don’t find it often, you often don’t find it.” We literally give up before we see.

The lesson is sobering: our eyes provide data, but our brains decide what matters. Seeing requires effort, patience, and awareness of our limits. By slowing down, questioning first impressions, and recognizing that attention is finite, we can at least notice more of what’s actually there—before we, like Burt Reynolds, strike out at the wrong target.


We Search for Meaning and Misremember Details

Why can you recall a childhood friend’s face perfectly but struggle to remember their name? Hallinan explains that this selective memory arises because humans prioritize meaning over detail. Psychologists Harry Bahrick and colleagues showed that decades after graduation, people remembered 73% of classmates’ faces but only 18% of their names. Names are arbitrary; faces carry emotional and social meaning.

The Penny Test and the Power of Meaning

In one famous experiment, researchers asked people to draw a U.S. penny from memory. Only one participant out of twenty reproduced its details accurately. We handle pennies thousands of times but never need to distinguish their fine features. The takeaway: you remember what’s meaningful to your purpose, not what’s objectively present. This insight explains why eyewitnesses misdescribe suspects, why we forget passwords, and why hiding valuables “in a clever spot” almost guarantees you’ll lose them.

Making Sense of the ‘Norman Einstein’ Error

Hallinan retells football legend Joe Theismann’s infamous slip of referring to “Albert Einstein” as “Norman Einstein.” What sounded like ignorance was actually memory’s tendency to retrieve related meanings: Theismann had known a high-school classmate named Norman Einstein who was very smart. His brain grabbed the wrong but semantically related name. These “tip-of-the-tongue” errors happen daily because our mental filing cabinets are sorted by meaning, not sound.

Encoding the Meaningless

Psychologist Hermann Ebbinghaus memorized thousands of nonsense syllables like “DAX” and “QEH” and found he forgot most within hours. But if information had structure or emotional significance, memory improved drastically. This explains the power of mnemonics—and why students remember “Every Good Boy Deserves Fudge” for musical notes decades later. Our minds crave coherence and story.

Recognition and Misidentification

Hallinan’s story of June Siler, a nurse who identified the wrong man as her attacker, drives home the danger of false memory. Eyewitnesses reconstruct identity from emotional traces (“the hate I saw at the bus stop”) rather than visual details. Studies found that 77% of DNA-exonerated prisoners had been convicted due to mistaken eyewitness identification. We remember impressions—anger, menace, familiarity—far more vividly than we remember features.

The upshot: your memory tells coherent stories, not objective truths. When you forget why you hid the gold coin or call someone by the wrong name, it’s not stupidity; it’s your brain doing what it evolved to do—filtering for meaning first, precision later.


We Connect Dots That Aren’t There

Our brains are chronic storytellers. Given scattered data, we weave connections—sometimes brilliantly, often dangerously. Hallinan shows that quick judgments and subtle cues shape our beliefs far more than reason does, creating a stream of invisible errors.

Snap Judgments and Faces

Princeton psychologist Alexander Todorov found that people could predict U.S. election outcomes by looking at candidates’ faces for one second. Voters consistently chose the faces rated as “more competent.” Even West Point cadets with sterner facial expressions rose higher in rank decades later. Appearance, not argument, drives perception. Hallinan warns that such correlations reflect our bias to equate facial cues with ability—a shortcut evolution built to gauge trust quickly but that now misfires in boardrooms and polling booths.

Hidden Signals and Biology

Even subtler cues mislead us. Researchers tracked topless dancers’ income across their menstrual cycles and found they earned far more in tips when most fertile, though neither dancer nor customer consciously noticed. Likewise, shoppers spent more money when stores diffused masculine scents—a reminder that unseen sensory triggers steer behavior. Hallinan’s point isn’t erotic science, but humility: much of what we call choice is physiological reaction dressed up afterward in reasoning.

Price, Color, and Context

Our tendency to connect price with quality is another illusion. In a Stanford experiment, the same wine tasted better when labeled $90 than $10—even activating more pleasure centers in the brain. Similarly, people reported stronger pain relief from a $2.50 placebo than from one costing ten cents. Color also manipulates perception: black sports uniforms led referees to call more penalties, both in real games and staged video tests. We literally see aggression because we associate black with danger.

The Myth of First Instincts

Students are taught “always trust your first answer,” but studies on test-taking show the opposite—changing answers usually improves scores. Why don’t people learn? Because regret biases memory: changing a correct answer to a wrong one feels more painful than staying wrong. We remember the emotional cost, not the statistical truth. Hallinan calls this the “Monty Hall” problem of life: our unwillingness to switch doors even when odds say we should.
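The Monty Hall analogy rests on a counterintuitive piece of arithmetic: switching doors wins about two-thirds of the time. A quick simulation makes the odds concrete (a minimal sketch; the function names are my own, not from the book):

```python
import random

def play(switch: bool) -> bool:
    """Play one Monty Hall round; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch: bool, trials: int = 100_000) -> float:
    """Estimate the probability of winning under a fixed strategy."""
    return sum(play(switch) for _ in range(trials)) / trials

random.seed(0)
print(f"stay:   {win_rate(False):.3f}")   # ~1/3
print(f"switch: {win_rate(True):.3f}")    # ~2/3
```

Staying wins only when the first pick happened to be right (1 in 3); switching wins in the other 2 cases out of 3—yet, as Hallinan notes, the emotional sting of switching away from a winner keeps us anchored to our first choice.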

In a world overflowing with information, our pattern-hungry brains connect the dots too fast. Recognizing these impulses—seeing that the connections themselves are often the illusion—is the first step toward real understanding.


We Wear Rose-Colored Glasses

Most of us think we remember life as it happened. In truth, we remember it as we wish it had. Hallinan calls this optimistic bias the mind’s built-in photo filter—it beautifies our past and shields our ego. Without realizing it, we revise memories to flatter ourselves and justify our actions.

Self-Enhancement in Memory

At Ohio Wesleyan University, students recalled their high school grades. Nearly 80% inflated them; A’s were remembered accurately, but D’s nearly vanished. Journalists, parents, gamblers—even presidents—exhibit the same rosy reconstruction. Former Nixon aide John Dean testified under oath about his conversations with the president during Watergate. When compared to taped transcripts, almost none of his recollections were correct, yet his version made him appear more central and heroic. Psychologist Ulric Neisser concluded that Dean’s testimony revealed how memory preserves meaning—especially self-serving meaning.

The Hindsight Trap

Knowing outcomes twists our perception of the past. Once we learn a plane crashed or a company failed, we think “it was obvious.” This “hindsight bias,” demonstrated by Baruch Fischhoff, makes us unfair judges of others and bad learners from our own experience. In experiments around Nixon’s trips to China and Russia, students misremembered their initial predictions—rewriting their past beliefs so they looked more prescient than they were. The same mechanism leads gamblers to reframe losses as “near wins” and investors to believe they “almost saw it coming.”

Bias in Decision Makers

Even professionals blind themselves. Doctors deny that free dinners or drug company perks influence them—but think their colleagues are affected. Experiments by George Loewenstein showed otherwise: when financial advisers disclosed a conflict of interest, they felt freer to give biased advice, believing transparency absolved them (“Hey, I warned you”). Clients discounted their suggestions—but not by enough. Disclosure became a moral license for dishonesty.

Hallinan concludes that self-delusion is a human constant. We revise, justify, and inflate because it feels better than facing our fallibility. Awareness can’t erase bias, but it can keep our inner spin doctor in check.


We Can Walk and Chew Gum—but Not Much Else

Multitasking feels efficient but is a biologically impossible myth. Hallinan uses dramatic stories and research to show that when we try to handle several things at once, we degrade performance in each—and risk disaster.

The Tragic Flight of Captain Loft

Eastern Airlines Flight 401 approached Miami when pilots noticed a small light indicating landing-gear trouble. As the crew focused on the $12 bulb, they failed to notice the plane’s slow descent. Five seconds after the captain finally asked, “What’s happening here?” the jet plunged into the Everglades, killing ninety-nine people. The cause: distraction. Engineers even coined a term for such accidents—Controlled Flight Into Terrain—to describe planes flown perfectly into the ground because attention was elsewhere.

The Illusion of Multitasking

Hallinan explains that even computers don’t truly multitask; they switch rapidly between tasks. The brain, slower and less predictable, can’t manage two conscious decisions at the same time. U.S. Army studies showed that talking on a phone while driving impairs reactions, especially after age forty. At Microsoft, employees interrupted by emails took fifteen minutes to return to full focus. Every switch exacts a tax: slower thinking, forgotten steps, and more errors.

Attention Overload in Cars

Cars today demand impossible focus. Hallinan catalogs gadgets—GPS touchscreens, DVD players, Bluetooth, night vision—that make driving resemble operating a cockpit. Yet each new “safety” device interrupts attention, ironically creating more danger. The average two-second glance away from the road doubles accident risk; entering a GPS address can take over a minute of distracted time. One bus driver in Washington, D.C., even drove into a bridge while on his cell phone—he never saw the bridge at all.

Hallinan’s conclusion is simple: when your mind switches tracks, your awareness vanishes for precious seconds. The best solution isn’t willpower—it’s design. Like the Air Force’s autopilots or European in-car systems that hold phone calls during lane changes, the answer lies in building environments that respect human limits.


We All Think We’re Above Average

Ask a roomful of people if they’re above-average drivers, and nearly all will raise their hands. Statistically impossible, but psychologically inevitable. Hallinan devotes a lively chapter to our collective overconfidence—the belief that we’re smarter, more ethical, and luckier than we are—and shows how that single bias multiplies mistakes across life, from dieting to Wall Street.

Blind Spots of Overconfidence

Dieters overspend on programs like NutriSystem because they overestimate self-control (“Of course I’ll stick it out”). Gym-goers sign monthly contracts they rarely use, losing hundreds of dollars. Credit card holders choose teaser-rate offers, believing they’ll pay balances off before interest spikes. In each case, expectations outpace behavior. We mistake what we should do for what we will do.

Calibration and Feedback

Certain people, however, learn to align confidence with reality. Weather forecasters, for instance, are remarkably well calibrated: when they predict a 30% chance of rain, it rains roughly that often. Why? Immediate feedback. In contrast, most of us act like bad soldiers on a rifle range—sure we’ll hit every target until the results prove otherwise. Warren Buffett’s humility about his worst investment—the Dexter Shoe disaster—demonstrates why long-term feedback makes some decision-makers wiser instead of cockier.

The Illusion of Control

We even believe we can control chance. Psychologist Ellen Langer showed that Yale students bet more money drawing cards against a sloppily dressed opponent than against a sharp one, feeling subconsciously advantaged. People who correctly “guess” a few coin tosses start thinking they can predict outcomes with practice. Hallinan concludes: the more uncertain the task, the stronger the illusion that effort can bend luck.

Recognizing overconfidence isn’t about modesty—it’s survival. The world’s most successful organizations, like Shell’s geology division, train teams to record both predictions and outcomes, confronting error rather than denying it. Awareness may not make us average—but it keeps us honest.


Why Mistakes Matter

In his conclusion, Hallinan insists that the path to wisdom runs through error. To err is human—but only learning from those errors sets us apart. Understanding why we make mistakes leads to humility and better systems of prevention.

Designing for Human Nature

From color-coded syringes to pilots’ checklists, smart organizations accept that humans mess up under pressure. They build “constraints” and “affordances”—features that make correct actions easier and wrong ones impossible. In contrast, hospitals with hierarchical cultures, where nurses fear questioning surgeons, suffer deadly rates of error. As in aviation, flattening power gradients saves lives.

Turning Error into Insight

Hallinan’s final advice echoes psychologist Carol Dweck’s research on growth mindset: the key isn’t avoiding mistakes but tolerating and analyzing them. Pilots log errors to prevent repeats; successful investors document decisions and track false predictions. By contrast, perfectionists fear mistakes so much they stop learning. Thinking negatively—asking what could go wrong—helps us build resilience before failure, as surgeons at Walter Reed did to reduce combat deaths.

Embracing Our Limits

Human attention, memory, and judgment are bounded, but awareness of those bounds creates strength. As Hallinan puts it, most of our everyday errors—from locking keys in the car to believing we can multitask—reflect conflict between rational intention and visceral impulse. Learning to slow down, seek feedback, and design smarter environments won’t make us perfect—but can make us a little less wrong, a little more often.

In the end, Hallinan’s message is forgiving yet pragmatic: mastery begins not with control, but with curiosity about our flaws. Mistakes don’t just reveal what’s broken about us—they illuminate what it means to be human.
