
Mistakes Were Made (But Not by Me)

by Carol Tavris and Elliot Aronson

Delve into the psychology of self-justification and its impact on our lives. Mistakes Were Made (But Not by Me) explores how admitting errors can mend relationships, improve decision-making, and foster personal and professional growth, while offering insights into overcoming cognitive biases.

Why We Justify: The Psychology of Self-Deception

Why is it so hard to say the simple words “I was wrong”? In Mistakes Were Made (But Not by Me), psychologists Carol Tavris and Elliot Aronson unravel this timeless puzzle by diving into one of the mind’s most powerful mechanisms: self-justification. They argue that most people, no matter how intelligent or well-intentioned, are wired to rationalize their actions and beliefs, turning even our worst mistakes into stories that make us feel right. This book isn’t just about why politicians dodge responsibility—it’s about why you argue longer than you should, hold on to bad investments, or struggle to admit you hurt someone you love.

Tavris and Aronson contend that the real danger of human error isn’t making mistakes—it’s justifying them. When we justify, we protect our self-image as moral, competent, and kind. But this protection comes at a cost: it blinds us to evidence, deepens our biases, and distances us from others. Understanding this process, they suggest, is the first step toward humility, better decision-making, and genuine self-correction.

Self-Justification: The Armor of the Ego

The book opens with familiar political apologies—“mistakes were made”—that dodge personal responsibility. But the authors quickly pivot, revealing that the same psychological defenses operate in our daily lives. Whether you’ve stayed in a bad job because you’ve “come this far,” justified a moral lapse as “necessary,” or redefined your partner’s flaws as virtues, you’ve experienced self-justification at work. It guards your self-esteem by rewriting reality to fit your story. What makes this mechanism tricky is that it’s rarely conscious—you believe your revised story sincerely.

This human tendency has both benefits and dangers. On one hand, it protects us from crippling self-doubt and shame. On the other, it perpetuates delusion and prevents growth. “Without self-justification,” Tavris and Aronson note, “we couldn’t sleep at night.” But with too much of it, “we can’t learn from experience.”

The Engine Beneath Our Rationalizations: Cognitive Dissonance

At the heart of the book lies Leon Festinger’s theory of cognitive dissonance—the tension that arises when you hold two contradictory beliefs or behave in ways that conflict with your values. To ease that discomfort, you don’t usually change your behavior—you change your beliefs to justify it. A smoker says, “We all have to die of something,” or a cheating student decides the test was unfair anyway. The stronger your commitment or investment, the greater your need to justify it.

Through vivid stories—from doomsday cults doubling down after failed prophecies to presidents defending disastrous wars—the authors show how dissonance fuels stubbornness. People don’t just defend their decisions; they come to believe them more deeply once challenged. As George Orwell cynically put it, “We are all capable of believing things we know to be untrue, and then, when proved wrong, twisting the facts to make ourselves right.”

How the Mind Edits Reality

Once set in motion, dissonance distorts how we see the world. We notice evidence that supports our view and dismiss what doesn’t—a process known as confirmation bias. The authors illustrate this with diverse examples: from music experts defending overpriced Stradivarius violins to citizens clinging to political myths. As neuroscience suggests, when we encounter dissonant information, the brain’s reasoning areas go quiet while its emotional circuits light up. This means bias isn’t just stubbornness—it’s biology.

This mechanism explains why fanatics, experts, and ordinary citizens can maintain wildly wrong beliefs. Whether it’s a cultist convinced that failed predictions “saved the world” or economists denying failed forecasts, every justification reinforces identity: “I’m smart, moral, and consistent.”

The Everyday Fallout of Justifying Ourselves

The authors extend this reasoning beyond politics and academia into love, law, and everyday life. We may justify staying in a harmful relationship (“at least we’re trying”) or a dead-end job (“I can’t waste what I’ve invested”). We misremember events to favor ourselves, claiming partners were more to blame or our share of housework higher than it really was. Over time, memory becomes not a recording of reality but a story that safeguards our self-concept. In relationships, self-justification masks cruelty as righteousness; in the justice system, it sustains wrongful convictions by trapping prosecutors and police in loops of certainty.

Even altruism gets twisted. We justify our aggression (“they deserved it”) and our favoritism (“they earned it”) through mental stories that let us keep feeling good. Once again, the goal isn’t truth—it’s comfort.

Climbing Down from the Pyramid of Certainty

Tavris and Aronson invoke their powerful metaphor of the “pyramid of choice.” Imagine two people faced with the same moral dilemma—say, to cheat or not to cheat. A single decision sends them down different sides of the pyramid. Each step of justification carries them further from each other until one sees himself as virtuous, the other as unfairly maligned. Over time, small justifications harden into identities: honest person versus opportunist, loyal employee versus rebel, believer versus traitor. The descent feels rational, but it’s paved with self-deception.

Recognizing this pattern, say the authors, is the key to freedom. Awareness of dissonance doesn’t erase bias, but it interrupts the chain reaction. Once you can notice when you’re justifying instead of learning, you can step back—before sliding all the way down.

Why This Matters Now

In an era where polarized politics, misinformation, and outrage dominate the public sphere, this book reads like a mirror held up to human thinking. Tavris and Aronson aren’t merely exploring why presidents can’t say sorry—they’re diagnosing why civilizations repeat mistakes. Whether we’re justifying wars, pseudoscience, or personal betrayals, self-justification blinds us to the very truths that could save us. The antidote is humility, curiosity, and what they call the courage of self-examination: the willingness to say, “Maybe I’m wrong.”

By the end, you realize that self-justification isn’t just a flaw—it’s a feature of being human. But by understanding how it works, you can weaken its grip and become, as they put it, a little wiser, kinder, and more honest—with others and yourself.


Cognitive Dissonance: The Pain of Contradiction

At the core of Tavris and Aronson’s argument is Leon Festinger’s discovery of cognitive dissonance—the mental distress triggered when your beliefs and actions clash. The classic example comes from Festinger’s infiltration of a doomsday cult that predicted the world’s end in 1954. When the apocalypse failed to arrive, the members didn’t abandon faith; they evangelized harder, claiming their prayers had saved humanity. Why? Because admitting they’d been wrong would shatter their self-image as rational, faithful people. Instead, they rewrote reality to fit their investment of time, identity, and sacrifice.

How the Brain Soothes Dissonance

When two conflicting cognitions collide—like “I’m a smart person” and “I just did something dumb”—your mind works overtime to restore harmony. That can mean changing behavior (“I’ll quit smoking”), but more often it means altering beliefs (“Smoking helps me relax, and stress is worse for my health”). Neuroscience now shows that when confronted with disconfirming evidence, reasoning areas of the brain go quiet while emotional regions light up; we literally defend our worldview as if under attack.

Imagine being a violinist who treasures your $5 million Stradivarius—then a study reveals that modern violins sound just as good, if not better. Rather than sell your prized instrument, you’ll question the study’s conditions, the researchers’ credibility, or even the listeners’ taste. Dissonance requires resolution, and “the evidence must be wrong” is often the easiest route.

The Effort Justification Trap

One of Elliot Aronson’s most famous experiments demonstrated the “justification of effort.” Students who endured a severe, embarrassing initiation to join a dull discussion group rated it far more interesting than students who breezed in. The more something costs you—in pain, time, or humiliation—the more you’ll convince yourself it was worth it. Real-world versions include brutal hazing rituals, expensive degrees, and rites of passage worldwide, from college fraternities to Hindu festivals where physical suffering strengthens devotion. The message is consistent: what you pay for becomes precious, not because it is but because you need it to be.

When Believing Becomes Seeing

The authors capture this paradox perfectly: for humans, it isn’t “seeing is believing” but “believing is seeing.” Whether voters defending their party’s scandals, citizens rationalizing failed wars, or professionals clinging to bad theories, people mold facts around feelings. The greater your emotional investment, the harder you twist reality. Even the absence of evidence becomes evidence—“the fact that we haven’t found any conspirators proves how clever they are.” Dissonance isn’t stupidity; it’s emotional survival.

For readers, understanding dissonance isn’t just academic—it’s a skill. The next time you feel an inexplicable wave of defensiveness, pause. That discomfort is dissonance. If you can notice it and resist the urge to explain it away, you’ve already begun telling yourself fewer lies.


Memory: The Self-Serving Historian

Our minds don’t just justify in the present—they revise the past to stay consistent. Tavris and Aronson describe memory as a storyteller, not a video recorder. When we recall events, we reconstruct them in ways that preserve our self-esteem. Over time, even small distortions accumulate into confident falsehoods. This is why spouses’ combined estimates of their shares of the housework routinely add up to more than 100%, or why students “remember” doing more studying than they actually did. We aren’t lying; we’re editing history to fit who we believe we are.

How Memory Shapes Morality

Self-justifying memory protects your moral identity. You don’t recall yourself as cruel, but as reacting to provocation. You forget slights you caused but recall clearly those done to you. Over time, this asymmetry builds distance and resentment in relationships. The authors note that this mechanism explains why feuds persist; each side’s selective memory corroborates its innocence and the other’s guilt. The result: objective truth collapses under the weight of personal narratives.

Why Institutions Forget Too

It’s not just individuals who misremember. Groups—police departments, governments, corporations—develop institutional memories that edit out wrongdoing. The more prestigious the institution, the stronger its incentive to defend tradition over truth. This is how unscientific medical practices persisted long after being disproved, and how law enforcement clings to flawed methods despite exonerating DNA evidence. Memory bias, collective or personal, keeps mistakes alive precisely because it prevents them from being remembered accurately.

Seeing your mind as an editor rather than a historian can be liberating. Instead of defending the unchangeable past, you can reexamine it with compassion and realism—turning memory into an engine of growth rather than justification.


The Pyramid of Choice: From Gray Areas to Certainty

Perhaps the book’s most vivid metaphor is the pyramid of choice. Picture two people at the top, facing an ethical dilemma: to cheat or not to cheat. Their choices diverge slightly, but then each justifies their path. The cheater decides “everyone cheats”; the other decides “cheaters are immoral.” By the time they reach the base, their views are miles apart. That’s how seemingly small decisions shape character. You don’t become corrupt overnight; you justify your way there step by step.

Micro-Moral Decisions

Tavris and Aronson emphasize that the earliest rationalizations are the most dangerous because they feel insignificant. Each act of self-justification makes the next compromise easier. Jeb Magruder didn’t join the Nixon administration planning to lie under oath; he simply wanted to “be part of history.” One rationalized decision led to another until he found himself committing crimes he once would have condemned. The same mechanism underlies everyday corruption, from fudged expense reports to betrayals defended as “exceptions.”

Reversing the Slide

Once you’ve descended the pyramid, reversing direction means facing dissonance head-on. That’s why moral progress often comes only after collapse—addictions, scandals, or personal wake-up calls. Yet awareness of the pyramid can act as a preventive lens. When faced with a gray-area choice, imagine where that first slide could lead. This mindfulness helps preserve integrity when immediate rewards tempt compromise.

Recognizing that others live on opposite sides of their own pyramids fosters empathy too. When you understand that their certainty stems from repeated justifications, not inherent evil, dialogue becomes possible again.


Aggression, Kindness, and the Cycle of Justification

One of the book’s most unsettling insights is that justifying harm intensifies harm. A child who bullies another feels dissonance—“I’m not cruel, so he must deserve it.” This reasoning strengthens with each act, creating a spiral of increasing aggression. Experiments show that venting anger, far from relieving tension, actually increases it because the aggressor must rationalize their hostility afterward. In one study by psychologist Michael Kahn, students who were given the chance to retaliate against an insulting technician showed greater hostility and higher blood pressure, not less; they had to convince themselves he deserved the punishment.

The Vicious and Virtuous Circles

Fortunately, the same psychology can produce compassion. If you do a favor for someone, you tend to like them more because you must justify your generosity: “I wouldn’t help a bad person.” Benjamin Franklin used this principle to turn enemies into allies by asking small favors, and experiments confirm it works. With children, letting them choose to share toys or stickers increases their future generosity—their actions redefine them as kind.

From Hate to Humanity

Cognitive dissonance explains both prejudice and reconciliation. When you harm or dehumanize others, you must justify it by seeing them as less worthy—fuel for racism, war, and exploitation. But when you act kindly toward someone outside your group, your mind flips the script to preserve consonance: “They must not be so different after all.” Tavris and Aronson argue that fostering cross-group cooperation works partly because action reshapes perception. Doing good leads to thinking good.

Understanding this loop hands you a quiet power: each small choice—to empathize or vilify—builds a narrative about who you are. And that story, repeated, becomes your character.


Escaping the Trap: Owning Mistakes with Mindful Humility

If self-justification is universal, is there any escape? Tavris and Aronson say yes—but it requires conscious effort. The solution isn’t perfection; it’s awareness. You can’t remove dissonance, but you can manage your response to it. The process begins by naming what you feel when challenged: that jolt of defensiveness is your mind protecting its identity. Instead of arguing, you can pause and ask, “What if the other side has a point?” This simple question cracks the armor of certainty.

From Rationalization to Reflection

Mindful reflection transforms mistakes into learning. Rather than spiraling into blame or shame, acknowledge both your intention and your harm. The authors close with compassion for those who can’t justify their errors—people haunted by guilt and failure. There’s a middle path, they suggest, between ruthless self-criticism and blind self-defense: owning the mistake, then moving forward. The same principle applies to nations, institutions, and couples as much as to individuals.

The Courage to Be Uncertain

True wisdom, the authors argue, is intellectual humility—the courage to accept ambiguity. Columnist William Safire exemplified this when he publicly criticized a political ally for the same secrecy he’d condemned in an opponent. It felt uncomfortable, but it preserved integrity. That discomfort is the price of honesty. Learning to tolerate it is the psychological equivalent of strength training for conscience.

Ultimately, saying “I was wrong” isn’t weakness—it’s evidence of growth. Admitting error frees you from the exhausting work of maintaining illusions. It clears the mental space where real change begins. And that, argue Tavris and Aronson, is not only how you live more honestly—it’s how you live better.
