
Not Born Yesterday

by Hugo Mercier

Not Born Yesterday debunks the myth of human gullibility by unveiling our evolved cognitive defenses against misinformation. Through studies and anecdotes, Hugo Mercier reveals how intuition and shared goals empower us to discern truth, fostering resilience against deception and enhancing our trust in the right sources.

The Adaptive Balance of Trust and Skepticism

Humans are often dismissed as gullible creatures overwhelmed by propaganda, fake news, and conspiracy theories. Yet Hugo Mercier asks you to consider a different story: our minds are evolutionarily equipped for open vigilance—a delicate balance of receptivity and skepticism that allows communication to work without collapsing into deceit. In his view (drawing on work with Dan Sperber on epistemic vigilance), you are neither credulous nor cynical by default but an adept evaluator of whom to trust, what to believe, and when to reject a message.

The evolutionary logic of communication

Communication is costly and risky because interests often diverge between senders and receivers. Natural selection therefore shaped mechanisms that protect you from deception while preserving the benefits of learning from others. Animal communication illustrates the rule: gazelles stot to advertise fitness credibly, bowerbirds’ decorations are policed by rivals to keep signals honest, and bees’ waggle dances remain reliable because colony members share reproductive interests. In humans, costly punishment, reputation, and repeated interaction impose similar constraints—turning language, otherwise cheap, into a stable, honest medium.

Why gullibility is rare

If people were broadly gullible, communication systems would collapse under exploitation. Instead, you practice constant checks: comparing new claims to background beliefs (plausibility), evaluating argument quality (reasoning), judging source competence and incentives, and adjusting for emotional context. These filters evolve early—toddlers already prefer testimony backed by evidence and track the accuracy of informants. Most attempts at mass persuasion fail for this reason: receivers ignore low-credibility claims or integrate them shallowly without behavioral commitment.

Where things go wrong

Your vigilance operates on priors rooted in everyday experience; when reasoning enters unfamiliar domains (pathogens, cosmology, macroeconomics), intuitive models can misfire. You may find “design” explanations of life or vaccine fears plausible because ancestral causal instincts overgeneralize. The informational explosion of modern life further strains vigilance: you must judge distant experts and institutions whose honesty you cannot verify directly. Mistakes follow not from credulity but from heuristics stretched beyond ancestral contexts.

Beliefs as social and emotional tools

Mercier unites findings across anthropology, psychology, and history to show that many widespread falsehoods—rumors, fake news, extremist statements—serve social rather than epistemic goals. Rumors and conspiracy tales often provide moral drama and identity signaling instead of practical guidance. Likewise, outrageously self-incriminating declarations (from North Korean panegyrics to online extremism) signal loyalty by burning bridges with rival groups. Such claims may look irrational but perform the adaptive function of demonstrating commitment.

Why reflection doesn’t always reach intuition

You can sincerely endorse counterintuitive doctrines—scientific, religious, or ideological—without those beliefs guiding behavior. People compartmentalize: physicists still show folk-physics errors, believers picture God anthropomorphically despite theological sophistication. These reflective beliefs persist because you trust perceived experts, not because your intuitive systems are replaced. Hence, credence in complex doctrines is often shallow; what matters is the reliability of the chains of trust that transmit them.

The moral of open vigilance

Taken together, these ideas reframe gullibility as a myth. Your cognitive architecture evolved for selective openness: motivated by argument, supported by institutional feedback, and sensitive to incentives. Falsehoods spread not because minds are broken but because communication environments sometimes remove the feedback that normally corrects error—opaque hierarchies, politicized science, or attention-driven media. The practical lesson is civic as well as cognitive: strengthen institutions that make honesty payoff-visible, maintain feedback loops for claims, and engage disagreement with reason rather than disdain. Most of the time, human communication works remarkably well precisely because you are not easy to fool.


Evolution of Honest Communication

Mercier grounds the story of open vigilance in evolutionary dynamics: honest communication survived because receivers evolved skepticism proportional to the risks of deception. In ecosystems and societies where interests diverge—between predator and prey, rivals and competitors—false signals get weeded out by countermeasures. As in animal signaling theory, truth emerges when lying is made too costly to sustain.

How selection weeds out dishonesty

Gazelles leap flamboyantly when spotting predators, a display too taxing for weak individuals to fake. Arab babblers issue genuine alarms only when a predator appears, lest repeated false alerts destroy credibility. Among bowerbirds, rival males vandalize exaggerated displays, enforcing honesty through sabotage. Human language is cheap in physical cost but heavy in social cost: speakers who lie lose reputation, cooperation, and long-term partners. This reputational bookkeeping substitutes for physical constraint, yielding a stable ecology of mostly reliable talk.

Costly signaling versus social enforcement

Mercier suggests that humans achieve “costly signaling for free.” Instead of burning resources, we outsource policing to social networks. Lies are punished through gossip and exclusion; diligence and transparency are rewarded. When journalists, scientists, or friends report truthfully, they preserve reputational capital that compounds over time. Conversely, liars may profit briefly but are soon filtered out of cooperative circles. This evolutionary stability explains why systematic gullibility would be maladaptive: communicative ecosystems would collapse if people constantly failed to detect deceit.

Reputation as evolutionary glue

Honesty persists because humans track trustworthiness relentlessly. Ostracism, loss of credibility, or reduced cooperation serve as fitness penalties. Institutions magnify these feedback loops—legal contracts, academic transparency standards, and journalistic norms make it costly to mislead large audiences. Every credible sender thus stands on a social platform stabilized by vigilance. Far from promoting gullibility, evolution makes you a discriminating receiver who demands reasons and checks sources before accepting claims.


Reasoning, Plausibility, and Bias

If evolution gave you the ability to detect deception, cognition provides the tools to assess truth. Mercier highlights two key processes—plausibility checking and reasoning—that together explain why most communication succeeds without blind faith. You constantly compare new statements to what you know and use reasoning to extend understanding beyond intuition.

Checking the plausible

Every claim you hear meets a filter of plausibility. When a friend asserts that the moon is cheese, prior physics knowledge forces rejection. This mechanism is protective: it spares you from constantly validating absurdities. Empirically, backfire effects—people doubling down when corrected—are rarer than commonly assumed. Usually, individuals calibrate beliefs moderately toward quality evidence (the Condorcet-type updating Mercier references). Priors are armor, not cages.
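The Condorcet logic Mercier invokes can be illustrated with a short simulation (this sketch is not from the book; the function name and parameters are illustrative): if each voter is independently right with probability just above chance, a simple majority of many such voters is dramatically more reliable than any individual—one reason moderate trust in converging testimony is rational.

```python
import random

def majority_accuracy(p, n, trials=10_000, seed=0):
    """Estimate how often a simple majority of n independent voters,
    each correct with probability p, picks the right answer."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        # Count how many of the n voters happen to be right this round.
        votes = sum(rng.random() < p for _ in range(n))
        if votes > n / 2:
            correct += 1
    return correct / trials

# Individuals are only modestly reliable (p = 0.6), yet a majority
# of 101 such voters is right the vast majority of the time.
print(majority_accuracy(0.6, 1))    # ~0.60
print(majority_accuracy(0.6, 101))  # well above 0.9
```

The assumption doing the work is independence: when voters copy one another (as in echo chambers), the aggregation benefit largely disappears—which dovetails with Mercier's point about polarization.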

Reasoning as social exchange

Reasoning refines beliefs cooperatively. As thinkers like Dan Sperber and Mercier argued earlier in The Enigma of Reason, reason evolved to win others’ agreement by crafting and judging arguments. This interactional logic explains why hearing a solid explanation—Gödel’s proof, Darwin’s mechanism—can change minds even when priors resist. The act of argument supplies inferential bridges individuals could not build alone. Reasoning is not cold logic but a social negotiation tool that enables collective cognition.

When intuition misfires

Your plausibility filters draw on ancestral intuitions meant for visible, local contexts; applied to abstract science they misguide. Causal intuitions treat life as designed, contagion as moral contamination—hence the intuitive appeal of creationism or vaccine hesitancy. Correcting these errors requires extended argumentative engagement, not soundbites. Persuasion through reasoning works best when interlocutors share background knowledge and motives for understanding. In that setting, argumentation remains humanity’s finest antidote to misinformation.


Trust, Expertise, and Incentives

Trust decisions hinge on cues of competence, access, diligence, and aligned incentives. Mercier argues that you are a surprisingly skilled evaluator of expertise. Even young children prefer informants who have direct evidence, consistent accuracy, and majority backing. Adults extend the same heuristics through reputational systems, institutions, and peer review.

Recognizing expertise

You naturally ask four questions about any claim: Does the source have privileged access? Have they been right before? Are other independent experts in agreement? Do they have something to gain from misleading you? Such judgments correlate strongly with truth-tracking performance across cultures. Communities from the Hadza to corporate teams identify skill within relevant domains efficiently. Trouble arises when signals of expertise—titles, affiliations, jargon—substitute for genuine domain success, inviting the “guru effect.”

Incentives and reputations

Liars seldom reveal themselves through body language; reliable cues come from motivation. You trust speakers whose incentives align with yours or whose reputation makes deception costly. Overconfident witnesses who err lose credibility faster than cautious truth-tellers (Tenney’s findings). Evaluating diligence may matter more than detecting intent—negligence misleads as effectively as deceit but is easier to fix through institutional design. Thus, accountability systems that reward transparency and penalize carelessness sustain honesty over time.

Why institutions matter

Proper incentives create a self-correcting communication order: scientists submit to replication, journalists to public scrutiny, financiers to audits. When feedback breaks—under tyranny or monopoly—gullibility becomes dangerous (as Mao’s embrace of Lysenkoism showed). Strengthening reputational checks is therefore a civic act, ensuring that trust remains rational rather than blind faith.


Emotion, Contagion, and Collective Behavior

You may picture crowds as irrational swarms and emotions as spreading like viruses. Mercier dismantles both metaphors. Emotional sharing is real but regulated: you mimic others’ affect automatically yet interpret whether it’s appropriate before acting. This emotional vigilance complements cognitive vigilance, keeping social influence adaptive rather than chaotic.

Emotional contagion with filters

Unlike pathogens, emotions spread selectively. Infants consult caregivers’ expressions only when uncertain; Tamis-LeMonda’s slope experiments show toddlers ignore exaggerated fear on safe paths. Adults weigh justification and context too—empathy declines when rivalry is expected. The emotional channel is open but not unguarded. This distinction explains why manipulation through emotion is difficult: audiences resist when cues or motives conflict with context.

Crowds as cooperative systems

Le Bon’s nineteenth-century vision of mobs as irrational herds crumbles under data. Historical and modern analyses—from French revolts to 9/11 evacuations—show crowds act rationally and prosocially under pressure. Panic is rare, coordination common. Emotions guide action but remain bounded by goals and moral assessment. You interpret others’ fear or outrage through meaning, not mirroring reflexes, making human crowds vastly more intelligent than folklore assumes.

The larger lesson

Human influence spreads through evaluated information and justified feeling, not mindless contagion. Recognizing this safeguards public discourse: emotional appeals work only when they resonate with prior values and trusted sources. Emotional vigilance makes you neither cold nor manipulable but contextually responsive—a design that keeps group life cooperative and resilient.


Rumors, Justifications, and Polarization

Rumors and fake news reveal an essential truth about belief: people often spread claims that serve social or justificatory functions more than informational ones. Mercier reframes misinformation not as infection but as post‑hoc rationalization—beliefs follow practice as much as cause it. You act first, then adopt stories that make your actions seem right or loyal.

When rumors track truth

High-stakes local gossip—about jobs, troop movements, or neighborhood danger—tends to be accurate because relevance, feedback, and reputational risks are high. WWII soldiers spread reliable deployment news because false rumors would backfire immediately. Verification thrives when costs and payoffs align with truth.

When rumors and fake news mislead

By contrast, low-stakes or morally flashy rumors (“elite conspiracies,” “kidnapping rings”) propagate because they entertain and mark belonging. Most holders act as if they don’t wholly believe them—reflective assent rather than behavioral conviction. Mass hysteria cases from Orléans to Pizzagate show this split: huge online endorsement but almost no action. Rumor serves to bind identity, not to plan behavior.

Beliefs as after-the-fact rationalizations

From bloodletting’s theories following practice to political propaganda justifying votes or violence, belief often clothes behavior already underway. The Kishinev pogrom’s blood-libel rumor functioned as moral license, not trigger. Modern fake news similarly comforts precommitted voters seeking one more reason for preferences formed along identity or grievance lines.

Polarization through justification accumulation

When groups continually collect reasons for their stance, those reasons accumulate into exaggeration. Group discussions amplify convictions because each member contributes same‑sided arguments, yielding what social psychologists call polarization. Online echo chambers supply convenient material, but homogeneous identity cues matter more than algorithms. The mechanism isn’t mindless contagion but motivated reasoning that fortifies loyalty.

Practical inoculations

You can reduce distortion by demanding sources, distinguishing reflective from practical belief, and creating forums for cross‑cutting argument. Correcting facts helps little if the belief functions socially; adjusting incentives—offering alternative affiliations, rewarding accuracy—works better. Recognizing that justification, not gullibility, drives much polarization reframes the problem from “people are duped” to “people defend what matters to them.”


Gurus, Counterintuitive Beliefs, and the Limits of Authority

Human deference to experts explains both scientific progress and intellectual cults. You often accept complex doctrines because trusted authorities endorse them, not because you understand their reasoning. Mercier warns that this reliance invites the “guru effect,” where obscurity itself signals depth and discourages challenge.

Why authority feels safe

Your cognitive economy prevents verifying everything firsthand. Trusting experts—scientists, priests, mechanics—is efficient when their incentives and track records justify it. Counterintuitive knowledge stays mostly reflective: people recite cosmology or theology without those ideas shaping daily causation judgments. This detachment limits harm but can entrench deference to impressive but untested claims.

How obscurity manufactures prestige

When credentials and opacity coincide, audiences interpret confusion as profundity. Dan Sperber coined the “guru effect” to describe how followers co‑create meaning around inscrutable masters (as in Lacan’s linguistic riddles). Obscurity raises evaluation costs and grants interpretive elites prestige for deciphering the undecipherable. Experiments show trivial math or neuroscience jargon inflates perceived rigor—proof that your judgment shortcuts can be gamed by surface cues.

How to defend against false depth

Demand clarity and community validation: can the claim be explained clearly by other qualified experts? Do peers replicate or critique the work seriously? Does obscurity yield predictive or practical payoffs? Intellectual humility means refusing to conflate difficulty with profundity. By maintaining this vigilance, you preserve the benefits of expertise without succumbing to charisma or confusion.


Institutional Feedback and Resilient Truth

The book builds to a civic argument: open vigilance flourishes only under conditions that keep feedback alive. When authorities are accountable, errors are self-correcting; when criticism is silenced, gullibility thrives by default. Mass propaganda, from the Nazi regime to modern partisan media, works mainly when feedback mechanisms collapse or audiences lack trusted alternatives.

Limits of mass persuasion

Historical data show propaganda typically reinforces prior alignment rather than engineering belief from scratch. Nazi rallies energized existing sympathizers; Soviet media sustained compliance through fear and consensus signaling, not conviction. Even modern campaign advertising rarely flips voters—it nudges low-salience attitudes. Open vigilance makes wholesale mind control exceedingly difficult.

When gullibility becomes dangerous

The genuine threats arise when decision-makers operate without feedback, as under Mao’s Lysenkoist agriculture, where ideological conformity trumped empirical testing. Institutions that conceal data or punish dissent short-circuit vigilance, allowing absurdities to persist unchallenged. Likewise, corrupt scientific bodies or opaque corporations breed mistrust that later fuels conspiracy thinking.

Rebuilding credibility

To sustain a truth-oriented society, support transparency, reproducibility, and diversity of viewpoints—conditions that let verification flourish. Persuasion’s power comes not from rhetoric but from networks of accountable trust. The final lesson: humanity’s default isn’t gullibility but cooperative sense-making; institutions either amplify or suppress that evolved competence. Guard the feedback loops, and communication keeps working.
