
The Silo Effect

by Gillian Tett

The Silo Effect explores the pitfalls of compartmentalized thinking in organizations and society. Gillian Tett reveals how breaking down silos unleashes creativity, improves communication, and prevents costly oversights, offering insightful strategies to foster collaboration and innovation.

How Silos Shape—and Distort—Modern Systems

Why do organizations that begin with creativity, curiosity, and shared purpose so often end up divided into rigid units that can’t see the bigger picture? This book argues that silos are not just bureaucratic accidents—they are cultural artifacts. From city halls and banks to hospitals, universities, and tech firms, each institution develops hidden classifications, rituals, and incentives that shape what its people see as natural. Over time, those mental and structural walls breed blindness. But anthropology, data integration, and deliberate design can expose and reshape them.

Drawing on cases from Sony, UBS, New York City Hall, Facebook, Cleveland Clinic, BlueMountain Capital, and the economics profession, the book connects human habitus and institutional design. It shows that silos endure because they feel inevitable, yet when you question the categories, you unlock new forms of collaboration and awareness.

The Anthropology of Silos

Pierre Bourdieu’s fieldwork in rural France provides the book’s conceptual backbone. His concept of habitus—the unspoken patterns individuals absorb from their environment—explains how divisions form naturally without explicit rules. Just as villagers sorted themselves into dancers and non-dancers at a Christmas ball, office workers divide into departments, professions, and ranks. What seems efficient organization is often social reproduction. The key anthropological lesson, following Bourdieu: every established order tends to make its own arbitrariness appear natural. You must learn to see your own categories as cultural artifacts.

This anthropological lens explains not only how companies like Sony framed “innovation” or how banks defined “client business,” but why smart professionals in all fields become blind to obvious risks. To see clearly again, you must become an insider–outsider: embedded enough to understand context, detached enough to notice absurdities.

When Categories Collide: Failure and Reinvention

Sony’s “octopus pots,” UBS’s fragmented risk system, and economists’ blindness to “shadow banking” reveal what happens when habitus meets high stakes. Sony built dozens of proprietary audio players because its leaders rewarded local P&Ls instead of shared platforms. UBS lost billions because risk officers treated asset classes and geographies as separate universes. Economists, entranced by elegant models, missed the off-balance-sheet credit system that would soon implode. In each story, the failure is upstream—an error of classification.

Yet the same principle also powers recovery. Once you name what was invisible, as when Paul McCulley coined “shadow banking” or Paul Tucker traced M4 growth through “Other Financial Corporations,” you make a hidden system legible. Naming and mapping create the cognitive bridge needed for reform.

Case Studies in Silo-Busting

Some pioneers fight silos through data integration, design, or daring cross-boundary moves. Mike Flowers’s skunkworks team at New York City Hall merged building complaints, tax records, and field inspector notes into predictive models that quadrupled housing inspection efficiency. Brett Goldstein in Chicago used data and gang movement maps to anticipate murders. Toby Cosgrove at Cleveland Clinic flipped the hospital’s structure from doctor-centered to patient-centered “institutes.” BlueMountain’s Andrew Feldstein exploited and then dismantled the same kind of buckets that crippled big banks. Each innovator began by questioning what others treated as obvious—“how we’ve always done things.”

The results were transformative: smarter predictions, higher patient satisfaction, lower operational waste, and new trading profits. But these successes share one fragility: without incentive alignment and political protection, the reforms revert. When Police Superintendent Jody Weis left Chicago, and when Sony’s cross-unit mandates faced cultural pushback, the old silos re-emerged.

Designing for Connection

Where traditional institutions tangle themselves in hierarchy, Facebook shows what it means to design culture around collisions. Mandatory Bootcamp immerses all new hires—managers included—in shared code and vocabulary. Hackathons, Hackamonths, and open architecture deliberately mix teams that would otherwise ossify into microtribes. Cleveland Clinic’s redesigned corridors and empathy training serve a parallel purpose: to make humans bump, talk, and share perspective. Architecture and ritual become silent teachers of connection.

The Human and Political Limits

No reform is purely technical. Goldstein’s maps provoked racial politics; Cosgrove’s redesign provoked professional turf wars. Incentives matter: BlueMountain and Cleveland Clinic succeeded by aligning pay across functions, while Sony and UBS stumbled because rewards stayed local. Technology alone is not salvation—you need translators, storytellers, and bridge figures who can navigate the politics of reform.

Core argument

To dismantle silos, you must combine three ingredients: anthropological awareness (to see hidden classifications), data and design (to reconnect the fragments), and incentive alignment (to make new habits durable). Culture builds the walls; curiosity and structure must take them down.

Ultimately, this book asks you to see beyond the surface of “who reports to whom.” Silos are psychological architectures disguised as organizational charts. By applying the anthropologist’s curiosity, the engineer’s integrative logic, and the designer’s empathy, you can change not only how your institution functions—but how it sees the world it serves.


Seeing Categories as Culture

The first step in breaking silos is realizing they begin in the mind. Drawing from Pierre Bourdieu’s anthropology, the book teaches that your cultural environment silently encodes classifications of status, taste, and legitimacy—what Bourdieu called habitus. You don’t consciously choose these frames; you inherit them and assume they are common sense. That blindness shapes how institutions carve their internal worlds.

Invisible Divisions

At a dance in Béarn, villagers divided into dancers and non-dancers without instruction. The split seemed “natural” because of embedded norms. Organizations behave similarly: engineers versus marketers, doctors versus nurses, traders versus risk officers. Once the habitus encodes hierarchy, each group defends its domain and translates information into its dialect, preventing system-wide learning.

Social Silence and Power

Bourdieu called the unspoken consensus that maintains these divisions doxa—the arena of things “everyone knows” but no one questions. Cultural capital—dress, jargon, credentials—enforces belonging. You feel comfortable within your “tribe” and awkward outside it, just as hospital departments or city agencies feel justified in guarding their data or methods. That silence protects power but hides risk and inefficiency.

The Insider–Outsider Method

Bourdieu’s most actionable idea is to become an insider–outsider. You try to occupy a position both within the system and marginal to it—close enough to understand context, detached enough to see what insiders take for granted. His fieldwork in Algeria taught him to view his native Béarn with a stranger’s eyes; you might embed analysts in field teams or invite outsiders into project reviews. The question “what are we not allowed to talk about?” is often the spark for discovery.

Practical takeaway

Treat your categories—products, clients, departments—as hypotheses, not truths. By surfacing tacit norms and testing whether they still fit, you begin to change your habitus and your institution’s perception of its own world.

Once you grasp that classification is cultural, not natural, you can redesign it. That mindset lays the groundwork for every practical story that follows: from the Bloomberg administration’s data skunkworks to Cleveland Clinic’s institutional reframe.


Data and Fieldwork: Bloomberg’s Skunkworks

Silo-busting isn’t only intellectual—it’s operational. Mike Flowers and John Feinblatt’s team in New York City Hall shows how ethnography plus data synthesis can dissolve bureaucratic blindness without a cultural revolution. After a deadly 2011 Bronx fire, they asked whether housing fires could be predicted. Thousands of complaints, hundreds of databases, and only 200 inspectors made the task seem absurd—until fieldwork met joined-up data.

Riding Along, Asking Naïve Questions

Flowers placed young data scientists with fire inspectors, asking: “What do you look for when you suspect illegal housing conversions?” Those ride-alongs uncovered subtle signals—old buildings, mortgage delinquencies, vermin complaints, recent brick deliveries—that were invisible in spreadsheets. Anthropology again: go see the field before building a model.

Joining the Fragments

Using PLUTO (Primary Land Use Tax Lot Output) as a backbone, the team linked 311 complaints, lien data, fire records, and building permits into one predictive system. Inspectors now prioritized addresses by risk; hit rates improved nearly fourfold. The takeaway: connect what was scattered across bureaucracies, and you change what’s visible and actionable.
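The joining pattern can be sketched in a few lines. This is a hypothetical illustration, not the city’s actual system: the field names, weights, and lot identifiers are invented, standing in for the role PLUTO’s tax-lot ID played as a shared key across scattered datasets.

```python
# Hypothetical sketch of the "join the fragments" pattern: each city dataset
# is keyed by a shared lot identifier, merged into one view per address, then
# ranked by a composite risk score. Names and weights are illustrative only.

complaints = {"lot1": 9, "lot2": 1, "lot3": 4}   # 311 complaint counts
liens      = {"lot1": True, "lot3": False}       # tax-lien flag
fires      = {"lot2": 0, "lot1": 2}              # past fire incidents

def risk_score(lot_id):
    """Combine signals into one score; missing data defaults to benign."""
    score = complaints.get(lot_id, 0)
    score += 5 if liens.get(lot_id, False) else 0
    score += 3 * fires.get(lot_id, 0)
    return score

def prioritize(lot_ids):
    """Return addresses ordered most-risky-first: the inspectors' worklist."""
    return sorted(lot_ids, key=risk_score, reverse=True)

worklist = prioritize(["lot1", "lot2", "lot3"])
print(worklist)  # lot1 first: many complaints, a lien, and fire history
```

The point of the sketch is the architecture, not the arithmetic: once every dataset shares a key, what was invisible across bureaucracies becomes a single sortable list.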

Replication and Impact

The same model later exposed pharmacy fraud and cigarette tax evasion. Each success came from curiosity, small budgets, and rapid prototyping. Resistance was overcome not by authority but by evidence—quick wins that made skeptics allies. This pattern reappears in every successful reform in the book: insight begins at the boundary of human observation and relational data.

Key lesson

Don’t wait for a perfect reorganization. Build bridges across datasets, sit with practitioners, and produce concrete results. Culture shifts when evidence accumulates faster than excuses.

The Bloomberg story proves that information architecture can be more revolutionary than new laws. It teaches how small, interdisciplinary groups can make silos transparent by rediscovering the field and connecting its data trails.


When Classifications Fail: Sony and UBS

If Bloomberg’s skunkworks represents what happens when curiosity triumphs, Sony and UBS show what unfolds when classification ossifies. Both organizations were filled with brilliant professionals who mistook their categories for truth—and paid the price.

Sony’s Octopus Pots

From the cofounders’ tinkering culture of the 1950s grew a rigid “company system” by the 1990s. Splitting into profit-accountable units created internal markets that rewarded protectionism. Howard Stringer’s “octopus pot” metaphor captured teams trapped defending their own pots—Walkman versus Vaio versus Network Walkman—while Apple unified design, software, and content under one P&L. Sony’s failure wasn’t a lack of intelligence but of integration; its taxonomies trapped innovation.

UBS and the Illusion of Prudence

UBS’s implosion during the financial crisis stemmed from a similar blindness. The bank categorized “super senior” CDO tranches as short-term market inventory instead of long-term exposure. Thousands of risk officers, scattered across product lines and geographies, never saw the global net position. When markets froze, what looked like diversified prudence was concentrated hazard. The Swiss regulator would later call it “institutional blindness.”

Underlying moral

Silos turn intelligence into ignorance when each unit’s framing hides how the parts relate. Your biggest risks are often the ones that live in other people’s definitions.

Both stories confirm the anthropological rule: what’s invisible to a culture is what it takes for granted. To stay healthy, an organization must periodically reclassify its world—and merge its data, incentives, and language around reality, not tradition.


Flipping the Lens: From Patients to Policing

Sometimes breaking silos means rewriting the very questions you ask. Toby Cosgrove at Cleveland Clinic and Brett Goldstein in Chicago each flipped a legacy taxonomy: one in medicine, one in policing. Both prove that changing how you see a problem transforms how you can act.

Cosgrove’s Patient Experience Revolution

Prompted by a student’s question, Cosgrove realized that world-class technical excellence did not equal empathy. He restructured the 43,000-person Cleveland Clinic around “institutes” defined by patient problems—Heart & Vascular, Neurological, Cancer—rather than traditional specialties. Surgeons, physicians, and radiologists worked side by side. Empathy training, redesigned spaces, and a Chief Experience Officer completed the turnaround. Salaried compensation helped align incentives, achieving top satisfaction rankings without hurting outcomes.

Goldstein’s Murder Maps

Brett Goldstein applied the same reclassification to urban violence. Joining Chicago PD after a tech career, he centralized data that had been scattered across divisions and built geo-temporal models predicting shootings based on gang movement and temperature shifts. Twice-daily calls and rapid-response units converted analytics into action, cutting homicides while the system ran. When politics shuttered the program, homicide rates jumped again—a stark reminder that structural insight needs political air cover to survive.

Unified insight

Cosgrove and Goldstein both replaced legacy classifications (“by specialty” or “by district”) with frames closer to lived experience (“by illness” or “by behavior over space and time”). That mental redesign turned siloed professions into adaptive systems.

Whenever you feel trapped by institutional categories, ask: what would happen if we reclassified the problem from our perspective to the user’s or citizen’s? That question alone often exposes the path out of the silo.


Designing Culture and Collisions

Facebook’s anti-silo design shows that connectivity can be architected. Growth inevitably breeds specialization; preventing that from hardening into turf wars takes deliberate structure, rituals, and physical design. Facebook’s six-week Bootcamp forces shared technical language. Hackamonth rotations and all-night hackathons engineer unpredictable collisions that generate social glue and new ideas. The open campus layout—with glass walls, catwalks, and collective spaces—turns architecture into a cultural operating system.

The Logic of Collisions

Facebook’s leaders treat inefficiency as a feature: rotating people is expensive, but it prevents stagnation. Cultural designers like Andrew Bosworth and Jocelyn Goldfein model vulnerability and transparency through internal posts, making online and offline spaces mutually reinforcing. Similarly, Cleveland Clinic’s redesigned corridors and public Red Coats embody openness in physical form. Design is the new anthropology—it tells people what behavior counts here.

Practical Design Principles

  • Create shared onboarding and language (Bootcamp).
  • Rotate people to refresh mental maps (Hackamonth).
  • Design buildings and rituals that make people bump into each other.
  • Accept short-term inefficiency as the price of long-term creativity.

Design imperative

Culture doesn’t emerge by accident. You must design its rituals as deliberately as you design your algorithms or buildings. Architecture and social engineering are complementary anti-silo tools.

If you lead any team, think of yourself as its cultural architect. You can’t remove every wall—but you can make them translucent and easy to cross.


Incentives, Politics, and Human Limits

Every successful reform in this book balanced technology, culture, and incentives. Every failure neglected at least one. Cultural change collapses when incentives pull people back to old definitions or when political forces treat data as a threat.

Align the Rewards

Cleveland Clinic’s salaried doctors and BlueMountain’s team-based bonuses fostered collaboration. Sony’s P&L silos and UBS’s fragmented risk mandates reinforced protectionism. Incentive design is the hidden architecture of cooperation: if you reward local gains, you guarantee global failure.

Manage the Politics

Goldstein’s murder maps fell victim to political and racial controversy. His analytics worked, but public perception of bias created backlash. In any institution, data becomes political the moment it changes power distribution. You must prepare your narrative and allies before your experiment threatens vested interests.

Recognize the Human Limit

Technology centralizes information but not imagination. Ride-alongs, empathy trainings, and interdisciplinary briefings remind you that data gains meaning through human judgment. The most effective organizations institutionalize human translators—people fluent in multiple professional languages—so numbers lead to shared understanding rather than new silos.

Final reminder

Breaking silos is a social act. Without empathy, incentive redesign, and political navigation, even the best analytics or reorganizations revert to the old order.

The enduring lesson: information alone doesn’t integrate systems. People do. Structure, culture, and politics move together—or not at all.
