
To Save Everything, Click Here

by Evgeny Morozov

To Save Everything, Click Here exposes the risks of 'technological solutionism', questioning the belief that technology can solve every societal problem. Morozov invites readers to critically evaluate the true impact of technology on our lives, urging a balanced view that recognizes both its benefits and its potential harms.

The Critique of Technological Salvation

You live in an age where nearly every social problem—waste, education, health, or crime—is recast as something a platform, app, or algorithm could fix. Evgeny Morozov’s central argument dismantles this mindset. He warns that faith in technology’s inevitability—the idea that innovation naturally replaces politics and ethics with efficiency—has become a quiet orthodoxy. In this worldview, progress means optimizing, not deliberating; interfacing, not governing.

Morozov labels this ideology solutionism: a belief that every messy human issue has a frictionless design answer, and that ambiguity is wasteful. It pairs with Internet-centrism: treating 'the Internet' as an epochal agent that rewrites history, politics, and culture by its very nature. These twin habits of mind promise transparency, participation, and automation, but often deliver control, superficial engagement, and moral blindness.

A Map of the Book

Across its parts, the book journeys from grand rhetoric to lived practice. It begins by unmasking the myths of Internet-era prophecy—how journalists, CEOs, and pundits invoke 'the Internet' or 'technology wants' as if they were natural laws. Then it scrutinizes how everyday life becomes a petri dish of data: self-tracking bodies, gamified habits, transparent governments, automated memories, and predictive policing. Each domain reveals the same logic—replace civic or moral reasoning with computational fixes that appear self-evident.

When Fixes Replace Questions

Morozov’s objection is not anti-technology; it’s anti-simplification. Solutionism, he argues, short-circuits the process of question formation. It presumes problems are well-defined and measurable, ignoring that many social challenges are contested precisely because we differ over values and goals. A recycling app, for instance, may reduce waste, but it also redefines civic virtue into a points system. Technologies like BinCam that post trash photos to social media teach performance, not citizenship.

The Politics of Inevitability

In linking this to Internet-centrism, Morozov targets the language of inevitability—Kevin Kelly’s “technium wants” or Silicon Valley’s obsession with 'working with the grain' of the Internet. Such rhetoric translates into policy resignation: if the future is preordained by the digital, then democratic deliberation is obstruction. Internet-centrism thus naturalizes corporate power. (Note: Morozov aligns here with scholars like Langdon Winner and James Scott, who warn how technical systems embody political orders.)

From Transparency to Surveillance, From Play to Control

Each later chapter explores how utopian Internet ideals distort civic life. Transparency becomes voyeurism; openness becomes a commercial hegemony of platforms; gamification transforms moral motivation into loyalty programs; personal analytics reframe privacy as a luxury. Beneath the rhetoric of empowerment, datafication produces asymmetries—companies and states observe citizens whose own observation rights lag behind.

A Call for Friction and Reflection

Morozov’s alternative is not unplugging but embedding friction—designing systems that provoke reflection instead of automating virtue. He praises adversarial and erratic designs that make users conscious of their moral agency, such as Usman Haque’s Natural Fuse, where energy use kills or saves plants depending on one’s settings. These projects remind you that good design can ask questions rather than presume answers.

What to Keep in Mind

To read this book is to rehearse democracy: to resist the seduction of seamlessness and to defend the messy space where fallibility, disagreement, and learning survive. Morozov positions himself in the lineage of Illich, Jacobs, and Hirschman—critics who valued local knowledge, friction, and civic imagination over technocratic fixes. He asks you to treat every 'smart' claim as a moral and political question.

Central Lesson

Technological progress is human-made and value-laden. You cannot hand governance to devices without also giving up the debate about what is right. Efficiency is not wisdom. The freedom to err, deliberate, and revise is the condition of a democratic life worth sustaining.


Solutionism and the Loss of Questioning

Solutionism treats technical design as moral destiny. Morozov defines it as the reflex to fix the visible symptom while ignoring why a problem arises. The ideology thrives on the allure of speed, quantification, and automation—three traits that fit engineering culture but flatten political life. When every social issue becomes a design challenge, politics is reduced to user experience optimization.

Examples of 'Fix Everything' Thinking

Projects like BinCam (which photographs household trash for Facebook competition) or sensor-guided kitchens illustrate how monitoring and gamification promise efficiency but erode autonomy. These fixes redefine responsibility as compliance. As Albert Hirschman’s triad warns, such fixes risk perversity (creating new harms), futility (treating symptoms), and jeopardy (crowding out deeper reforms). MOOCs, for instance, expand access but also justify defunding in-person liberal arts education.

Why It Persists

Solutionism survives because it flatters multiple audiences. Politicians see lower costs, entrepreneurs see markets, and citizens see convenience. The ideology presents 'fixing' as morally pure and apolitical. But the real question is who defines the problem that needs fixing. A beautifully optimized system can still encode destructive goals.

Alternative Practices

Morozov’s antidote begins with reframing: ask what should remain inefficient. Friction and imperfection sustain democratic life because they force interaction and judgment. Tools should augment rather than automate—help the cook experiment, not surrender judgment to recipes. Design as moral pedagogy, not obedience training. (Note: Here Morozov echoes Ivan Illich’s notion of 'convivial tools'—technologies that extend human potential without dictating ends.)

Guiding Question

When a device promises improvement, pause and ask: What kind of person or polity does this improvement presume? Every fix carries a philosophy of what counts as flourishing.


The Internet and the Illusion of Epochalism

You often hear that 'the Internet changed everything.' Morozov argues that this is more a rhetorical myth than an empirical truth. By treating 'the Internet' as a singular, omnipotent entity, scholars and policymakers obscure its diversity and politics. This mindset—Internet-centrism—acts like a theology: it naturalizes specific designs and disguises corporate or ideological interests as fate.

The Two Pathologies

First, determinism: treating the contingent as inevitable (Eric Schmidt’s 'grain of the Internet'). Second, compression: assuming that disparate platforms—Wikipedia, Amazon, Facebook—share a moral logic of openness and democratization. In both cases, real actors and economic incentives vanish behind 'the Internet' as mythic subject. Morozov compares this to what Bruno Latour called social Pasteurization: invoking a heroic figure to simplify complex assemblages.

Why It Matters

If you attribute political change to 'the Internet,' you neglect institutions, policies, and power. You stop asking whether search engines deserve regulation or what labor practices sustain social networks. Epochalism—declaring every shift revolutionary—leads to shallow governance and historical amnesia. (Historical note: crowdsourcing dates back to the 18th-century Longitude Prize, undermining digital exceptionalism.)

Countering the Myth

Morozov urges precision: discard 'the Internet' in favor of concrete nouns—search algorithms, crowdfunding platforms, or social media firms. Place each in its material and historical context. Ask whose interests are served when change is declared inevitable. Such specificity reintroduces politics into technology talk.

Key Reminder

History didn’t begin with Wi-Fi. Technological revolutions are rarely pure beginnings—they’re continuations disguised by novelty. Seeing the seams restores your capacity for choice.


Transparency, Openness, and the Politics of Sunlight

You are told that more openness means more democracy. Morozov’s chapter on transparency warns you to distrust easy metaphors like 'sunlight as disinfectant.' Transparency, when treated as an absolute rather than an instrument, can wound privacy, distort behavior, and breed performative politics.

When Sunlight Burns

The California donation-mapping site Eightmaps illustrates this risk: public campaign data, once buried in paper files, became interactive maps leading to harassment. Similarly, in Argentina, open-data activism forced municipal data online without political safeguards, generating backlash and data suppression. Information’s meaning changes when you amplify it through searchable interfaces. Visibility is not neutral—it amplifies some facts while erasing context.

Social and Psychological Trade-offs

Political science shows that hyper-visibility can degrade trust and foster 'stealth democracy'—citizens prefer competent governance over continuous exposure. Audit culture turns process into performance: politicians act for metrics, not for deliberation. Transparency, paradoxically, can corrode accountability when it incentivizes showmanship over substance.

A Healthier Model

Instead of fetishizing data dumps, Morozov urges design that preserves practical obscurity: data should be accessible but contextualized, auditable but not instantly viral. Add expiration dates, metadata explaining use, and search friction. Responsible transparency recognizes visibility’s civic consequences.

Moral

Treat transparency as a tool to build trust, not a bonfire to burn complexity. Openness that ignores context risks making democracy more fragile, not more accountable.


Data, Privacy, and the New Marketplace of the Self

Morozov exposes a curious inversion: as citizens are told 'privacy is dead,' the same culture invites them to buy it back. He calls this the Ryanairization of privacy—where every act costs extra. You can pay services to mask your identity or scrub your data, turning a human right into a luxury product.

When Data Becomes an Asset

Startups like Personal.com and Daytum present the self as an economic portfolio. You decide what data to release, trade, or withhold—for a fee. Kevin Kelly quips, 'Privacy is an illusion, but you’ll have as much as you pay for.' This logic creates new inequities: bankers hire reputation firms to hide past scandals while ordinary users face algorithmic profiling they can’t afford to escape.

From Rights to Transactions

By transforming privacy into a tradable good, the market replaces solidarity with individual bargaining. The poor become data sources; the wealthy buy silence. Even well-meaning 'data locker' initiatives commodify autonomy. (Note: Legal theorist Scott Peppet warns that employers demanding fitness-tracker data create coercive 'personal prospectuses.')

Reasserting Civic Privacy

Morozov insists privacy must remain a collective safeguard. Laws—not price tags—should guarantee limits on profiling and data brokerage. He urges policy that equalizes protection: data portability, contextual consent, and prohibitions on discrimination. Civic life depends on zones where you are not an economic category.

Insight

A right you have to buy isn’t a right at all. Turning privacy into an upgrade risks institutionalizing surveillance as the default setting of modern life.


Algorithms, Gatekeepers, and the Myth of Neutrality

Algorithms are now the invisible governors of everyday experience. Morozov dismantles the claim that they merely 'reflect the public will.' In reality, they operationalize corporate and social choices: what to rank, flag, or monetize. He invites you to see them not as mirrors but as engines that shape the very reality they claim to measure.

Revealing the Hidden Hands

Google’s AdSense ban on Guernica magazine or autocomplete defamation lawsuits in France show how algorithmic judgments carry cultural and ethical weight. Brent Payne’s Mechanical Turk experiment demonstrated that minor manipulations could skew search predictions. Yet these systems remain black boxes: their workings shielded as trade secrets.

Why Opacity Is Political

When platforms call their code neutral, they evade responsibility while exercising enormous influence. The danger is not only bias but unaccountability: you cannot appeal an algorithm’s decision without knowing its logic. As algorithms govern news visibility, credit scores, policing, and hiring, secrecy becomes governance without representation.

Paths to Oversight

Morozov proposes audits, redress mechanisms, and transparency-by-design to restore contestability. Treat platforms like utilities that owe explanations and due process. Technologies gain legitimacy not from popularity but from their capacity to be challenged.

Essential Reminder

'We’re just reflecting users' is a convenient fiction. Whoever builds the mirror controls which reflections—and which distortions—enter the public mind.


Gamification and the Economics of Motivation

From points for fitness to digital badges for reading news, gamification makes social engineering playful. Morozov appreciates the creativity but warns: when you incentivize virtue, you risk trivializing it. Games that replace intrinsic motivation with external rewards may boost compliance while weakening long-term commitment.

Manufacturing Engagement

Designers like Jane McGonigal and Gabe Zichermann imagine games that 'fix reality.' But as psychologists show, the overjustification effect converts genuine interest into reward-chasing. Recyclebank’s eco-points or Zamzee’s youth activity badges reshape values into transactions. Chromaroma’s subway gamification turns commuting into a spectacle rather than a spur for civic reform.

When Points Replace Principles

Morozov distinguishes between gamification that creates new capabilities (Foldit’s crowdsourced protein folding) and gamification that repackages duty as fun. The problem isn’t play itself but moral outsourcing: civic responsibility becomes equivalent to a leaderboard rank. Motivation becomes programmable, not reflective.

The Deeper Concern

A society guided by points trusts programmers more than publics. If games can nudge everything, politics becomes a behavioral economy. Morozov asks you to retain deliberation: to decide which actions deserve incentive and which deserve education, empathy, or solidarity instead.

Reminder

Not everything should be fun. Some virtues require effort precisely because they are civic, not commercial, achievements.


Predictive Policing and Preemption

At the frontier of solutionism lies preemptive governance—systems that promise to foresee and prevent harm before it happens. Morozov examines predictive policing tools like PredPol and the NYPD’s Domain Awareness System to explore what happens when code anticipates crime. Efficiency rises, but liberty contracts.

The New Logic of Governance

Predictive algorithms use past data to forecast hotspots, guiding patrols or triggering alerts. Yet bias in data reproduces bias in policing. Poor neighborhoods are over-monitored, creating self-fulfilling loops. When prevention becomes automation, error correction becomes impossible.
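The self-fulfilling loop described above can be illustrated with a toy simulation (a hypothetical sketch, not a model of any real predictive-policing product): two neighborhoods share the same underlying crime rate, but one starts with more recorded incidents, patrols are allocated according to past records, and crime is only recorded where patrols happen to look.

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying crime rate, but "A" starts
# with more recorded incidents because it was patrolled more heavily.
TRUE_RATE = 0.1                   # chance a patrol visit records an incident
recorded = {"A": 20, "B": 10}     # biased historical record

for year in range(10):
    total = sum(recorded.values())
    # "Predictive" allocation: patrols follow past records.
    patrols = {n: round(100 * recorded[n] / total) for n in recorded}
    # Crime is only observed (and recorded) where police are sent to look.
    for n, visits in patrols.items():
        recorded[n] += sum(random.random() < TRUE_RATE for _ in range(visits))

print(patrols)  # "A" keeps receiving more patrols despite equal true rates
```

Because new records accrue in proportion to patrol presence, the initial bias never washes out: the forecast reproduces the history that produced it, which is the loop Morozov warns cannot be self-correcting once it is automated.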

From Surveillance to Stasis

The greater danger is preemption as philosophy: embedding law into code so disobedience becomes impossible. A self-locking car for intoxicated drivers or algorithmic censorship of 'risky' speech may protect safety but erase protest. Law without contest is not justice.

Safeguards for a Preventive Society

Morozov calls for algorithmic audits, transparent standards of probable cause, and public deliberation before deep surveillance becomes default. Prevention must remain contestable. Reversibility is the essence of democratic power.

Provocation

We may one day live in a world without crime—but also without freedom. The cost of perfect prediction is the loss of human unpredictability, which is another name for change.


Adversarial Design and the Rebirth of Reflection

Amid the drive for smooth, optimized life, Morozov celebrates designers who build friction back into technology. Adversarial design, articulated by Carl DiSalvo, transforms devices into instruments of reflection. Instead of making systems invisible, it dramatizes compromise and responsibility.

Examples of Friction by Design

Usman Haque’s Natural Fuse connects plant-based carbon sinks to electrical outlets: draw too much power and your connected plants wither. The system forces moral trade-offs—you must choose between 'selfish' and 'selfless' modes. Similarly, Europe’s 'erratic appliances' or Germany’s Caterpillar extension cable misbehave to signal energy waste, turning sustainability into a dialogue rather than a number.

Why It Matters

These artifacts train citizens, not consumers. They reveal infrastructure’s ethics and teach collective reasoning. In contrast to smart devices that hide complexity, adversarial designs invite you to think about how systems interact. They use discomfort as pedagogy. (Parallel: Jane Jacobs’ defense of messy urban life as the condition for vitality.)

Designing Democracy’s Tools

Morozov turns design into a moral exercise: your technologies should help you confront, not outsource, your responsibilities. Whether in energy, data, or memory, adversarial systems show that resistance and reflection can be designed, too.

Lesson

A 'smart' world shouldn’t mean a frictionless one. Sometimes, good design must grate—because thought begins where automation stops.
