
The Power of Noticing

by Max H. Bazerman

The Power of Noticing by Max H. Bazerman reveals how vital the ability to notice is for effective decision-making and leadership. Through real-life examples like the Challenger disaster and Hurricane Katrina, Bazerman emphasizes the importance of overcoming biases and blindness to avert crises and enhance leadership effectiveness.

The Power of Noticing: Seeing What Others Miss

How often have you walked into a meeting, made a decision, and only later realized that critical information sat right in front of you—but you never saw it? In The Power of Noticing, Harvard professor Max H. Bazerman asks this piercing question to challenge the way you perceive your world. He argues that success, leadership, and ethics depend not on greater intelligence, but on the ability to notice what others ignore.

Bazerman contends that humans routinely suffer from what he calls bounded awareness — a psychological blind spot that keeps us from seeing salient data, ethical risks, or emerging dangers. Whether it’s an executive overlooking warning signs before a corporate collapse, a regulator ignoring the signals before a financial catastrophe, or ordinary people turning away from unethical behavior to protect their own interests, these failures stem from the same cognitive trap: we act as if “what you see is all there is.”

From Focus to Blindness

Bazerman’s argument begins with a deceptively simple experiment: the famous “basketball video,” where viewers asked to count passes fail to notice a person in a gorilla suit walking through the frame. This phenomenon, known as inattentional blindness, reveals how focus narrows perception. He expands the metaphor—showing that executives, policymakers, and entire institutions focus so tightly on their objectives that they miss the broader context entirely. His own reaction to missing the gorilla becomes a metaphor for professional life: everyone praises focus, but few recognize its hidden cost—blindness to what truly matters.

Why Noticing Matters for Leadership

For Bazerman, noticing isn’t just about spotting problems; it’s about cultivating a moral and strategic advantage. Episodes involving leaders like Jamie Dimon at JPMorgan Chase and Joe Paterno at Penn State show how blindness can damage careers and institutions. In boardrooms and bureaucratic hierarchies, people fail to observe unethical or risky behaviors because they have motivations not to notice. They fear damage to their reputation, breaches of loyalty, or the loss of their careers. When motivations distort perception, entire organizations can fail to act even in the face of glaring evidence—as with the Sandusky scandal at Penn State or the Catholic Church’s cover-up of abuse. Bazerman’s concept of motivated blindness captures this corrosive dynamic perfectly.

The Psychological Landscape of Blindness

Drawing on decades of behavioral science, Bazerman builds on heroes of the field—Herbert Simon’s bounded rationality, Daniel Kahneman’s System 1 and System 2 thinking, and Amos Tversky’s work on cognitive biases. But he adds a critical dimension: we don’t just misuse data; we fail to see data at all. System 1, our fast, intuitive brain, often blinds us. System 2, our slower, reflective thinking, can rescue us—if we learn to use it deliberately. Through stories like the Challenger space shuttle disaster, the Enron collapse, and the 2008 financial crisis, Bazerman shows how experts, engineers, and leaders missed obvious warning signs simply because they were focused on irrelevant details.

A Blueprint for Better Decision Making

Bazerman’s solution is practical. He urges leaders and individuals to build habits of questioning what isn’t presented. In meetings, ask, “What data do we not have?” In negotiations, consider what your counterpart withholds. In organizations, reward curiosity rather than blind loyalty. Effective noticing, he argues, can prevent predictable surprises—the kinds of disasters everyone “should have seen coming.” Notice the patterns, incentives, and blind spots that shape behavior. Notice the silence in the room. Notice what didn’t happen when it should have.

The Promise of Becoming a “First-Class Noticer”

In his closing chapters, Bazerman borrows the term “first-class noticer” from leadership scholar Warren Bennis. A first-class noticer sees emerging threats before they erupt, recognizes moral failures before they metastasize, and identifies opportunities others overlook. Whether you are a CEO, a teacher, or a citizen, your success depends not just on thinking harder, but on noticing better. The Power of Noticing is both a psychological manual and a moral call to action: broaden your vision, challenge your loyalties, and learn to see the invisible dynamics that shape decisions and behaviors. Because the things we don’t notice—those unasked questions, unchallenged assumptions, and unheard alarms—are often the ones that determine our fate.


Motivated Blindness: The Ethics of Not Seeing

Why did so many responsible adults ignore the crimes of Jerry Sandusky at Penn State? Why did Catholic Church officials overlook decades of abuse? Bazerman’s concept of motivated blindness explains these alarming failures—not as acts of evil, but as human tendencies to turn away from uncomfortable truths when noticing them would threaten self-interest.

How Loyalty and Self-Interest Create Blindness

At Penn State, Sandusky’s crimes were known to many—janitors, coaches, the athletic director, and even the legendary Joe Paterno. Yet none reported him to police. Fear of jeopardizing their jobs or the institution’s reputation stopped them. Each person did what appeared morally acceptable within their narrow frame—reporting upward but not outward. Loyalty became a blinder. Bazerman argues that such blindness is not rare; it is motivated. We often fail to see unethical behavior when it’s not in our interest to notice.

The Church as a Parallel Case

He compares Penn State to the Catholic Church’s massive cover-up of child sexual abuse, notably in Boston under Cardinal Bernard Law. Law, a respected moral leader and civil rights activist, re-assigned known abusers rather than exposing them. His loyalty to the institution and belief in redemption blinded him to the ongoing harm. The Church’s hierarchy, Bazerman notes, institutionalized motivated blindness—defending its behavior against charges by accusing the media of anti-Catholic bias. Loyalty and vested interests, rather than malice, perpetuated silence.

Behavioral Science Behind the Blindness

Drawing on psychological studies, Bazerman explains that people interpret the world through self-serving lenses. We tend to see those we admire as ethical and ignore disconfirming evidence. From football fans who believe their team plays fair (the Princeton-Dartmouth study, 1954) to corporate auditors who overlook client fraud for fear of losing contracts, this bias affects every context. When self-interest and loyalty align, ethical awareness collapses. Even whistleblowers hesitate, knowing the cost of speaking up.

Overcoming Motivated Blindness

Bazerman insists that motivated blindness is universal but surmountable. Leaders must create incentives to notice inconvenient truths. Organizations should reward transparency and establish consequences for ignoring misconduct. On a personal level, you can train yourself to pause and ask, “What am I not seeing because I want things to be easy?” Recognizing your motivations—career, loyalty, reputation—helps expose what lies hidden behind them. The courage to notice, Bazerman concludes, can transform both ethics and leadership.


Industrywide Blindness: When Systems Corrupt Perception

What happens when an entire profession fails to notice its own conflicts of interest? Bazerman explores “industrywide blindness,” where built-in economic and social incentives make corruption invisible to those inside the system. His examples—from auditors at Arthur Andersen to researchers fabricating data—expose how industries normalize unethical practices while believing they are acting with integrity.

Auditing Without Independence

In theory, auditors exist to ensure companies report their finances honestly. In practice, they depend on those same companies for lucrative contracts and consulting fees. The result: audits that protect profits instead of truth. In testimony to the Public Company Accounting Oversight Board, Bazerman argued that auditor independence remained a fiction even after the Enron scandal. Auditors, he explains, are human—they unconsciously filter data in ways favorable to the clients who pay their bills. Independence collapses under economic pressure.

Academia’s Hidden Biases

Scientific fraud may be rare, but subtle unethical practices persist. Bazerman cites the cases of psychologists Marc Hauser and Diederik Stapel, whose data fabrication shocked academia. Yet beyond individual deceit lies a systemic issue: researchers routinely manipulate experiments to achieve publishable results—“p-hacking,” rounding data, and omitting trials. In a culture where careers depend on publication, questionable methods spread like contagion. The field rewards outcomes, not accuracy, creating blindness that compromises scientific truth.

The Power of Incentives

Bazerman extends the same logic to credit-rating agencies—whose job was to assess risk but whose profits depended on leniency. Paid by the firms they rated, agencies gave AAA marks to toxic mortgage securities before the 2008 crash (mirroring auditors’ dependency). Incentives determine what gets noticed. When seeing the problem threatens profits, industries simply stop looking. Reflecting on these parallels, Bazerman argues that systems of reward often override ethical awareness.

Fixing the System

To restore sight, Bazerman proposes structural reforms: rotate auditors, ban consulting for audit clients, make research data transparent, and hold credit-rating agencies accountable to investors rather than issuers. Ethical integrity cannot rely on individual virtue alone. It must be embedded in the architecture of incentives. In industries from accounting to science, noticing corruption means redesigning the system so honesty isn’t punished and silence isn’t rewarded.


Missing the Obvious on the Slippery Slope

One of Bazerman’s most troubling insights is the “slippery slope” effect—how small ethical compromises gradually lead to catastrophic failures. He uses stories like Bernie Madoff’s Ponzi scheme and the Challenger space shuttle to show that people rarely leap into corruption; they slide into it, inch by inch.

The Science of Gradual Decline

Research on “change blindness” reveals that humans fail to notice slow alterations in their environment. Bazerman and colleague Francesca Gino applied this to ethics: people are more tolerant of wrongdoing that unfolds gradually. In their experiments, participants approved increasingly inflated estimates without realizing how unethical their acceptance had become. The slippery slope effect explains how auditors missed Enron’s fraud and how traders like Kweku Adoboli at UBS escalated from small concealments to billion-dollar losses.

From Overconfidence to Escalation

Overconfidence ignites the slope. Executives believe that bending rules is harmless, confident future success will justify minor misreporting. When results don’t improve, they double down, manipulating data further to hide past actions. Catherine Schrand and Sarah Zechman’s study of SEC cases revealed this pattern repeatedly: most fraud began as optimistic adjustments that snowballed into deliberate deceit. Each round of manipulation feels rational—until they’re trapped.

Real-World Downfalls

Bazerman recounts the fall of Bernard Bradstreet, CEO of Kurzweil Applied Intelligence, who moved from creative accounting to outright fabrication. Bill Clinton’s escalating denials of the Lewinsky affair likewise became a case study in how cover-ups compound damage. Slippery slopes are not merely personal weaknesses but organizational ones: banks failing to notice rogue traders, boards ignoring financial anomalies, and teams rationalizing each step deeper into deceit.

Stopping the Slide

Bazerman urges vigilance: leaders must watch for gradual deviations, not just blatant misconduct. Ethical auditing, transparent reporting, and a culture of questioning make slippery slopes visible before they become avalanches. For individuals, reflective pauses and accountability partners can curb incremental wrongdoing. As he reminds us, most ethical failures are not leaps of evil but steps of human reasoning that spiral out of control.


Seeing What Didn’t Happen: The Sherlock Holmes Principle

In his story “Silver Blaze,” Sherlock Holmes solved a murder by noticing the dog that didn’t bark. For Bazerman, this kind of awareness—seeing what’s absent—is the heart of effective leadership. Missing information often tells the most crucial story. He invites you to become Holmes-like: look not just for what is happening, but for what should be happening but isn’t.

Errors of Omission

People fear harm from action more than harm from inaction. Psychologists Ilana Ritov and Jonathan Baron showed that when given a choice between a vaccine carrying a slight risk and doing nothing, most people chose inaction—even when inaction led to worse outcomes. Bazerman applies this omission bias to real-world decisions: governments fail to adopt life-saving organ donation policies because they notice emotional objections to action but ignore the silent deaths caused by doing nothing.

Unnoticed Inequities

He extends the idea to universities’ legacy admissions. While affirmative action for disadvantaged groups draws public scrutiny, “affirmative action for the wealthy”—preferential treatment for alumni children—remains largely unnoticed. The rejected students who might have been admitted never “bark”; they don’t know they were displaced. Bazerman prompts readers to imagine if universities publicly listed who lost admission to make space for legacies—outrage would ensue, and reforms would follow.

Learning from What Isn’t There

To train yourself as a noticer, Bazerman suggests mental exercises: when analyzing a problem, ask a Holmesian question—“What’s missing?” In negotiations or investigations, consider what your counterpart fails to mention. In life, notice the silence of those who should speak up. What isn’t said can reveal more than any confession. Seeing the dog that didn’t bark transforms perception into foresight.

Like Holmes, first-class noticers detect patterns others dismiss. They read blank spaces, unraised voices, and absent evidence. In doing so, they prevent crises others never saw coming.


Thinking Ahead: Strategy Beyond the Moment

Bazerman’s ninth chapter shifts from noticing past failures to anticipating future ones. Thinking ahead—foreseeing consequences and the reactions of others—is an advanced form of noticing. It prevents predictable mistakes by expanding perspective beyond the immediate horizon.

Lessons from BP and Susan G. Komen

CEO Tony Hayward’s infamous comment after BP’s 2010 oil spill—“I’d like my life back”—illustrates the cost of failing to think one step ahead. In moments of crisis, leaders must imagine how words or actions appear to those most affected. Similarly, when the Susan G. Komen Foundation defunded Planned Parenthood, it ignored the overlap between its donors and women’s health advocates. Public outrage was inevitable. Both decisions reveal how myopia, not malice, can destroy reputations.

The Game Theory of Perception

Bazerman connects thinking ahead to game theory—the strategic practice of anticipating others’ moves. His “Acquiring a Company” problem shows that failing to consider what information the other party holds leads to self-defeating offers. The rational price, paradoxically, is zero. Buyers who think only of averages, not conditional responses, lose every time. True foresight means analyzing incentives, information asymmetry, and future reactions before acting.
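
The logic behind the zero-price answer can be checked with a short simulation. In the standard textbook setup of this problem (details the summary does not spell out, so treat them as assumptions): the firm’s value V is uniform on [0, 100] and known only to the seller, the firm is worth 1.5 × V to the buyer, and the seller accepts any offer at or above V. A minimal Monte Carlo sketch under those assumptions:

```python
import random

def expected_profit(offer, trials=100_000, seed=0):
    """Average buyer profit in the 'Acquiring a Company' problem.

    Assumed setup: V ~ Uniform[0, 100], known only to the seller;
    the firm is worth 1.5 * V to the buyer; the seller accepts
    any offer >= V. Conditional on acceptance, V averages only
    offer / 2, so the buyer receives 0.75 * offer in value while
    paying the full offer.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        value = rng.uniform(0, 100)
        if offer >= value:  # seller accepts only when the offer covers V
            total += 1.5 * value - offer
    return total / trials

for offer in (0, 30, 60, 100):
    print(f"offer {offer:>3}: average profit {expected_profit(offer):+.2f}")
```

Every positive offer loses money on average, because acceptance itself is bad news: the deal only closes when the firm was worth less than the bid. That adverse-selection twist is exactly what buyers who think in unconditional averages fail to notice.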

Balancing Cynicism and Trust

Bazerman’s own story—a taxi driver lying about a strike—illustrates useful cynicism. Trust should be earned, not assumed. Yet too much cynicism, he warns, blinds us to genuine collaboration. In experiments where buyers communicated with sellers, suspicion often reduced profits. Effective noticing means finding equilibrium between skepticism and openness, guided by strategic empathy: putting yourself in others’ shoes to understand their motives.

From Anticipation to Leadership

Thinking ahead transforms reactionary leadership into proactive leadership. It requires reflective pauses, scenario planning, and moral imagination—the ability to foresee how decisions affect others tomorrow, not just today. Whether preventing crises or crafting ethical policies, Bazerman teaches that good leadership isn’t just seeing what’s in front of you, but envisioning what lies beyond the next turn.


Leadership and Predictable Surprises

Why do disasters that could have been prevented keep catching leaders off guard? Building on his earlier book Predictable Surprises, Bazerman explores why governments and corporations fail to act on clear warnings—from Hurricane Katrina to the 2008 financial meltdown.

Cognitive, Organizational, and Political Blindness

Leaders avoid inconvenient truths for three reasons: optimism, bureaucracy, and politics. Cognitive biases promote positive illusions (“it won’t happen to us”). Organizational silos fragment information so no one sees the whole picture. Political calculations discourage costly preventive action. Fixing New Orleans’s levees or improving airport screening before 9/11 required spending money now to avoid invisible future losses—anathema to short-term thinkers.

Recognizing, Prioritizing, and Mobilizing

Successful leaders, Bazerman insists, recognize looming threats early, prioritize them through cost-benefit analysis, and mobilize resources to mitigate harm. Barack Obama’s efficient response to Hurricane Sandy contrasts sharply with the disarray of Katrina—proof that foresight, communication, and readiness stem from noticing patterns before disaster strikes.

Ethical Accountability

For Bazerman, predictable surprises aren’t just policy failures—they’re moral ones. When leaders ignore foreseeable dangers (like climate change or financial bubbles), they fail their duty to notice and act. True leadership demands attention to the unseen—the early tremors of crisis. Only by institutionalizing accountability for noticing can organizations avoid reliving the same preventable catastrophes.


Developing the Capacity to Notice

Bazerman concludes with practical wisdom: becoming a “first-class noticer” isn’t an innate talent—it’s a discipline you can cultivate. Drawing on Warren Bennis’s leadership philosophy, he shows how noticing sharpens decision-making, innovation, and moral clarity.

Adopting a Noticing Mindset

Start by reframing failures as internal, not external. When crises hit, most people blame circumstances. First-class noticers, however, ask, “What didn’t I do that I could have?” This mindset transforms errors into insight. It encourages humility and continuous learning—the foundation of leadership growth.

Questioning Conventional Wisdom

Bazerman draws inspiration from Michael Lewis’s Moneyball: Billy Beane revolutionized baseball by noticing overlooked data. You can apply the same principle—challenge traditions, spot inefficiencies, and ask “why not?” when facing constraints. Innovation, Bazerman notes, often arises from noticing what others consider impossible.

Outsider Thinking and Organizational Design

Sometimes outsiders see clearly what insiders miss. To notice more, invite external perspectives or imagine yourself as an outsider to your own problem. Leaders must also design systems that encourage noticing—like rewarding employees for identifying risks rather than punishing them for dissent. He calls this being a “noticing architect.”

The Balance of Focus and Awareness

Bazerman closes with a gentle paradox: focus is essential, but awareness is wiser. To lead effectively, occasionally remove your blinders and look around. Every leader must learn when to concentrate and when to notice. In doing so, you elevate not only your decisions but also your entire organization.
