Idea 1
From Common Sense to Uncommon Understanding
How can you understand societies and systems that are too vast, too tangled, and too unpredictable for intuition alone? In Everything Is Obvious (Once You Know the Answer), Duncan Watts argues that the very thing that guides you through daily life—common sense—becomes misleading when you use it to explain complex social behaviors. Common sense gives you local navigation tools: it tells you how to act in an elevator or at a dinner party, but it fails dramatically when you try to scale those insights into general rules for nations, economies, or cultures.
Watts’s core message is bold: your intuitions, stories, and post-hoc explanations feel reliable only because they fit local contexts. When you move from elevators to entire cities or economies, those same instincts gloss over hidden interactions, sampling biases, cumulative feedbacks, and chance events. To really grasp how societies work—and to make better decisions—Watts invites you to combine systematic data, experiments, and network thinking with humility about what you can and cannot know.
The problem with common sense
Common sense works by analogy. You recall similar situations and map current experience to past ones—an approach that is fast and adaptive. But it is also narrow: what feels obvious depends on local norms and shared assumptions. Those assumptions fail when you move beyond familiar contexts. For example, planners who designed high-modernist housing projects like Chicago’s Robert Taylor Homes relied on intuitive reasoning—symmetry, order, efficiency—without grasping how residents’ social networks functioned. The result was disaster.
Watts shows that intuitive stories are seductive but unscientific. Once an outcome occurs, every explanation feels obvious: you construct circular “of course” narratives after the fact, ignoring alternative paths that could have happened. Paul Lazarsfeld made the point vividly in his famous discussion of the wartime study The American Soldier: he presented supposed findings that readers judged self-evident, then revealed that the actual results were the opposite, and those seemed just as obvious once stated. That gap between foresight and hindsight is the trap Watts wants you to escape.
Hidden influences and psychological biases
You think people choose rationally or according to culture, yet studies show how subtle contextual cues reshape decisions. Defaults largely determine organ donation rates; easy-to-read fonts make statements feel truer; and even arbitrary numbers influence bids in auctions (a bias known as anchoring). Your environment silently shapes choices more than your motivations do. This means that both everyday explanations (“Germans are different from Austrians”) and political theories that assume stable preferences ignore how small features, like opt-in boxes or unseen cues, steer mass behavior.
Micro meets macro: emergence and feedback
Much of social reality arises from interaction rather than intention. Granovetter’s riot model shows that collective outcomes depend on the full distribution of individual thresholds, not on any single decision: a minor change in one person’s tolerance can flip an entire system from calm to chaos. Likewise, cumulative advantage makes lucky breaks look like merit; the Mona Lisa’s fame owes much to historical accidents, such as its 1911 theft, rather than inherent brilliance. The myth of influencers stems from similar errors: what looks like personal genius often reflects conditions ripe for contagion and many small actions aligning at once.
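Granovetter’s knife-edge result is easy to reproduce. The sketch below is a minimal, hypothetical simulation (the function name and the 100-person crowd are illustrative, not from the book): each person joins a riot once the number already rioting reaches their personal threshold, and the process iterates to a fixed point.

```python
def cascade_size(thresholds):
    """Iterate Granovetter's threshold model to its fixed point:
    a person riots once at least `threshold` others are already rioting."""
    rioting = 0
    while True:
        # everyone whose threshold is met by the current crowd joins
        joiners = sum(1 for t in thresholds if t <= rioting)
        if joiners == rioting:      # no new recruits: equilibrium reached
            return rioting
        rioting = joiners

# Granovetter's own example: thresholds 0, 1, 2, ..., 99 cascade fully,
# but nudging one person's threshold from 1 to 2 stops the riot cold.
uniform = list(range(100))
nudged = list(range(100))
nudged[1] = 2
print(cascade_size(uniform))  # 100: the whole crowd riots
print(cascade_size(nudged))   # 1: only the instigator acts
```

The two crowds differ by a single person’s tolerance, yet one produces a full riot and the other almost none, which is exactly why observers explaining the outcomes afterward would invent two very different stories about two nearly identical populations.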
Learning from history and predicting the future
Stories about success and failure usually emerge afterward, shaped by hindsight bias and sampling errors. You remember “important” events but ignore countless similar cases that led nowhere. The temptation is to treat outcomes—wars, elections, viral hits—as inevitable. Watts cautions that historical narratives, though useful, are hypotheses about what might have mattered, not proofs of mechanism. Recognizing this uncertainty prepares you for the next challenge: prediction.
When you try to predict social phenomena, the deterministic ideal embodied by Laplace’s demon collapses, because interactions multiply and randomness matters. You can forecast probabilities for repeatable patterns (e.g., sales cycles) but not unique events (e.g., revolutions). The rational stance is humility: design flexible strategies, test hypotheses, and rely on probabilities and experiments rather than point predictions.
A path toward uncommon sense
The rest of Watts’s argument builds on this foundation. You move from intuition to systematic empiricism: use randomization to isolate causes, measure behavior directly through digital tools, and treat planning as adaptive learning rather than clairvoyance. “Measure and react” replaces “predict and control.” Zara’s fast-fashion model, Yahoo!’s A/B testing, and Google’s search-based trend analysis show that continual measurement beats sophisticated forecasting. Meanwhile, strategies that emphasize flexibility—like Raynor’s scenario planning or Hayek’s decentralized knowledge—help you thrive amid uncertainty.
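The “measure and react” loop can be sketched as a toy A/B test. Everything here is hypothetical (the function, variant names, and true conversion rates are invented for illustration): rather than forecasting which design will win, you randomize users between variants, measure outcomes, and let the data decide.

```python
import random

def measure_and_react(n_users, true_rates, seed=42):
    """Toy A/B test: randomize each user to a variant, record whether
    they convert, and return the empirically better variant."""
    rng = random.Random(seed)
    stats = {v: [0, 0] for v in true_rates}        # variant -> [conversions, trials]
    for _ in range(n_users):
        v = rng.choice(list(true_rates))           # randomization isolates the cause
        stats[v][0] += rng.random() < true_rates[v]  # simulated user behavior
        stats[v][1] += 1
    observed = {v: conv / trials for v, (conv, trials) in stats.items()}
    return max(observed, key=observed.get), observed

# With enough users, measurement recovers the better design without
# any forecast of which one "should" win.
winner, rates = measure_and_react(20_000, {"A": 0.05, "B": 0.07})
print(winner, rates)
```

The design choice mirrors Watts’s point: randomization removes the need for a causal story up front, and the feedback loop (deploy, measure, keep the winner) substitutes cheap repeated measurement for expensive and unreliable prediction.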
Ultimately, Watts’s uncommon sense asks you to combine data, experiments, and humility. Admit what you cannot foresee, expose your assumptions to evidence, and embrace feedback loops that let you adjust. Doing so transforms your understanding of how societies work—from a comforting world of causal stories to a complex, probabilistic system that you can learn from but never fully predict.