
Careless People

by Sarah Wynn-Williams

Platforms As New Institutions

How do private platforms become de facto public institutions? In this memoir of practice and power, Sarah Wynn-Williams argues that Facebook’s rise was not just a business story; it was institutional formation in real time. She contends that products at planetary scale collapse the gap between code and governance—turning design decisions, growth loops, and PR campaigns into instruments of state-like power. To understand both the promise and the peril, you have to see how growth, culture, diplomacy, and ethics collide inside a company built to move fast yet suddenly responsible for civic order.

You follow Sarah’s arc from a shark-attack survivor in New Zealand to a U.N. lawyer disillusioned with slow treaties, and finally to a policy leader at Facebook (2009–2017). The through-line is a search for leverage: she leaves multilateralism for a platform that already organizes politics, media, and social life. Her conviction hardens early: when politicians and entire communities migrate to Facebook, the people setting its rules sit “at the center of everything.”

From mission to machinery

Inside, the company feels less like an office than a cultural operating system. Perks erase friction so you can live at work. Leadership models—Marne Levine’s battle rhythm, Sheryl Sandberg’s polished public power—convey that stamina outranks seniority. Stock-anchored status stratifies teams. Those norms shape policy choices: “move fast” rewards shipping and apologizing later; rigorous deliberation looks like drag. That culture becomes the default lens on the world, even when the world expects ceremony, law, and process.

Growth becomes governance

The core drama is a structural tension: Javi Olivan’s growth machine versus a thin, overextended policy function. Growth pushes contact importers, People You May Know, and aggressive international expansion—especially after the IPO dip (“running out of road”). Policy warns: at global scale, product decisions create law-like effects. When the product ships to Myanmar or Germany, it collides with sovereignty, speech norms, and safety realities. Without governance “baked in,” misfires become crises.

Diplomacy isn’t decoration

Sarah’s government training exposes a blind spot: statecraft runs on ritual and protocol, not just code. Seats at summits signal respect. Hoodies and slogans like “MOVE FAST AND BREAK THINGS” read as contempt in ministries. The Cartagena seat-switch, open-plan HQ tours for German officials, and bowing expectations in Korea all showcase how cultural signals drive political access. A tech CEO must step into a role closer to a head of state than a founder; refusing that role burns long-term influence.

Ethics under commercial pressure

The book’s moral center is the China question. Mark’s “3-year plan” treats access as an engineering-and-relationships problem. Internal proposals flirt with granting Chinese authorities user-data access, forcing Hong Kong users into new ToS, and building region-block tools and PoPs that expose data to state power. A risk memo even spells it out: employee actions could enable “death, torture and incarceration.” Without pre-agreed red lines, market hunger drives opportunistic drift. (Compare Shoshana Zuboff’s political economy lens in The Age of Surveillance Capitalism.)

Product-policy collisions, from organ donors to the unconnected

Sheryl’s organ-donation push shows how well-meaning features can outrun legality and culture. Mark’s four-word email—“I am overruling you”—signals where authority sits. Internet.org/Free Basics raises the stakes: a limited, unencrypted, permissioned “internet” marketed as humanitarian access. The UN pop-up, celebrity endorsements, and the Connectivity Declaration mask design choices that weaken security and hand gatekeeping power to Facebook and carriers—prompting bans (Chile), scrutiny (Brazil), and a backlash in India.

Democracy, safety, and organizational truth

Elections supercharge the risks. Campaigns weaponize Custom and Lookalike Audiences, dark posts, and Brand Lift. Revenue booms; transparency and guardrails lag. In Myanmar, overlooked basics—Burmese localization, Unicode rendering, and adequate moderators—allow incitement to spread during ethnic cleansing. Meanwhile, ad research celebrates targeting teens when they feel “worthless.” A pattern emerges: optimization without counterweights externalizes harm onto the least protected.

Key Idea

Platforms are now civic infrastructure. If you don’t design, resource, and govern them like public institutions—with protocol, local capacity, and enforceable ethics—you will ship instability at global scale.

Across these chapters, you learn a practical playbook: respect diplomacy as infrastructure; pair growth with built-in governance; set ethical bright lines before hard markets harden you; and align internal incentives with external duty. The lesson is not anti-technology. It is anti-naivete. If you want technology worthy of trust, you must match engineering ambition with constitutional thinking—and do it before the next crisis decides for you.


Culture Sets The Algorithm

Wynn-Williams shows you how environment becomes destiny. Facebook’s perks, pace, and status norms don’t just make life comfortable; they encode a decision function. Everything from on-site meals to laundry is framed as productivity design. The subtext is a 24/7 availability contract, with stamina as the currency of respect. That ethic privileges speed over deliberation—a sensible trade in code, a risky one in geopolitics.

Perks as operating system

The Little Red Book rationalizes services as friction removers. But culturally they anesthetize boundaries: colleagues become your social world; late-night work looks normal. In practice, “move fast” seeps into policy: prototype, ship, backfill process later. That makes product triumphs feel inevitable and policy cautions feel obstructionist. The hierarchy follows stock tenure more than title, producing odd wealth inversions that warp status and candor (assistants in Louboutins, managers on normal pay).

Leadership rhythms as norms

Marne Levine’s predawn triage and Sheryl Sandberg’s polished omnipresence create powerful archetypes. You adapt or get left behind. Their presence also compresses debate windows: when executives decide by email at 2 A.M., complex cross-border risks rarely get full hearings. That rhythm is not trivial; it’s how organ-donor megaphones, India war rooms, and China access plans accelerate past institutional safeguards.

The Lean In paradox

You’d expect the author of Lean In to preside over a model workplace for women. The reality is a double bind. Public storytelling celebrates women’s ambition; internal incentives reward “invisible” motherhood. Sarah sends Sandberg talking points from a delivery room, later hears that a baby-care emergency is “bad optics.” Events laud empowerment while travel loads, leave policies, and meeting norms remain unforgiving—privileging the childless or the wealthy who can outsource care. (Compare Anne-Marie Slaughter’s “Why Women Still Can’t Have It All.”)

The pedestal economy

Sandberg’s office functions like a court. Aides gain proximity, gifts, and plum roles if they mirror the boss’s needs. Dissent downgrades your standing. That intimacy-for-loyalty trade hollows oversight. When Sarah reports Joel Kaplan’s intrusions and inappropriate behavior, HR maneuvers narrow the complaint, question her performance, and freeze headcount—then terminate her without a farewell. The message travels: protect the brand, punish the breaker of silence.

Why culture equals policy

A culture that worships speed, wealth, and proximity will pick growth over governance in close calls. It will also rationalize ethically dubious revenue—like targeting teens when they’re “stressed” or “worthless”—because the machine rewards line items, not long-term trust. If you lead, you have to add counterweights: independent escalation paths, protections for whistleblowers, explicit slack for deliberation, and incentives that credit harm avoided as much as features shipped.

Key Idea

Your culture is the algorithm that selects which risks you see, which warnings you ignore, and which people you silence. Change that algorithm, or it will choose for you.

Practically, this means resourcing policy as a first-class function; formalizing cross-functional scoping for high-risk launches; aligning HR and public commitments; and making “protocol literacy” part of leadership onboarding. Otherwise, what looks like efficiency becomes a risk multiplier at planetary scale.


Growth Versus Governance

If you build for scale, your hardest problem is balancing expansion with legitimacy. Facebook’s version pits Javi Olivan’s growth playbook—contact importers, virality loops, and international blitzes—against a small policy team tasked with preventing global blowback. The tension peaks after the IPO stumble (“running out of road”), when growth becomes existential and international markets look like oxygen.

What growth believes

Growth treats the world like greenfield: move first, stake out network effects, iterate locally. Internet access, UX speed, and user loops are nonnegotiable. In this frame, resistance from regulators is a friction problem to be engineered around—through new features, workarounds, or political campaigns. Success looks like penetration curves and daily actives up-and-to-the-right.

What policy fears

Policy sees durable risks: privacy violations in Germany; sovereign sensitivities in Brazil; propaganda dynamics in Myanmar; net neutrality everywhere. Global features ship instantly, but laws and cultures don’t refactor on release day. Sarah’s trips make it vivid: a junta can block Facebook in Myanmar; prosecutors can detain staff in Brazil; a telecom deal can spark national protests. Trust and legitimacy become the true growth ceiling.

How collisions happen

The organ-donation push illustrates the micro version: Sheryl’s mission story meets country-by-country legal complexity. Engineers want a megaphone; policy says “do the scaffolding first.” Mark’s email—“I am overruling you”—decides in favor of speed. Scale that logic to Internet.org and you get a macro-collision: a curated, text-only, unencrypted pseudo-internet pitched as humanitarian access. The optics and the product are misaligned; the backlash is inevitable.

Missing institutions

The problem isn’t bad intent; it’s absent process. Facebook privileges engineering instincts—build, test, learn—over constitutional ones—deliberate, legitimize, constrain. There’s no routine that forces joint problem scoping with local partners, no default for red-teaming abuse vectors, no clear escalation forum where policy can stop the train. Leadership attention substitutes for governance, which works until it doesn’t.

A better design

Treat high-risk launches like public works: conduct cultural and legal impact reviews (the policy analogue of environmental impact reviews), co-own milestones across product, policy, legal, and comms, and set pre-agreed kill criteria. Build early warning via civil society channels; localize documentation and reporting in relevant languages; create a standing “Protocol & Policy” table for CEO trips and sensitive markets. (Note: this mirrors the “safety by design” movement in trust & safety and the “responsible AI” pre-mortem practice.)

Key Idea

If governance isn’t built into growth, crisis will be. Speed wins a quarter; legitimacy sustains a decade.

For you as a builder or advisor: insist on cross-functional charters and decision rights before expansion; measure “trust earned” alongside MAU; and train leaders in statecraft, not just storytelling. Otherwise, you will keep discovering that your product ships laws you never meant to write.


Connectivity Or Gatekeeping?

Internet.org/Free Basics is the book’s canonical case of good intentions engineered into bad governance. Onstage, it looked irresistible: Mark Zuckerberg at the U.N., a glossy “Connectivity Declaration,” celebrity endorsements (Shakira, Bono, Stephen Hawking, Charlize Theron), and a promise that access would lift people out of poverty. Offstage, product choices told a different story: a curated, unencrypted slice of the internet chosen by Facebook and carriers, deployed to the least digitally literate users.

The theater and the product

The company built an innovation pop-up outside the U.N. and bought full-page ads to frame Free Basics as humanitarian. But the app did not support encryption or 2FA, lacked many moderation and safety capabilities, and operated as a permissioned garden. It made censorship, surveillance, and manipulation easier precisely where users were least protected. Even the name “Internet.org” implied neutrality and nonprofit status it did not have—so misleading that Brazil’s Ministry of Justice pushed back on the brand.

Civil society saw through it

Sixty-seven digital rights groups warned early: this violates net neutrality and entrenches platform gatekeeping. The critique wasn’t pedantic. When a platform and telco decide which sites are free, they pick winners and losers, shape knowledge flows, and tilt local markets. That’s a sovereign decision masquerading as philanthropy. Internally, Sarah and colleagues fought the branding and the feature set; leadership compromised on a rename—Free Basics—far too late to fix the DNA.

Local politics aren’t a deployment detail

Chile banned zero-rating. Brazil’s Dilma Rousseff prioritized infrastructure over free apps. In Indonesia and Myanmar, deal-making ran into the thicket of telecom incentives and state sensitivities. Each market surfaced the same flaw: the project centered growth optics over rights-centered design. The supposed beneficiary—the first-time user—got the most fragile experience.

Design embeds values

Free Basics proves that connectivity isn’t morally neutral. Encryption support, content eligibility, moderation capacity, and data practices are political choices. If you don’t ship security and rights protections by default, you ship power asymmetries. (Note: this mirrors critiques of “walled garden” strategies from early mobile carriers and the debate over zero-rating in the U.S. and India.)

Build the scaffolding first

A better model would have started with open standards, neutral treatment of traffic, independent oversight, and country-by-country civil society compacts. You pair product rollouts with localized safety resources, transparent partner criteria, and grievance mechanisms. You don’t market a subset as the whole. And you tell the truth about tradeoffs before regulators discover them.

Key Idea

When you sell “access,” you’re selling a constitution. If you don’t encode neutrality, security, and transparency, philanthropy becomes gatekeeping at scale.

For your own work: interrogate whether your “help” reduces autonomy, invites surveillance, or substitutes PR for capability. If the splashiest stage in the world is needed to sell it, start by asking what the product itself can’t say.


India’s Street-Fighter Playbook

India is where platform persuasion turned into a policy war. After early pushback to Free Basics, Mark urged “street fighter tactics”: mobilize users, weaponize platform tools, and pressure regulators. The India Action Plan even wrote it down—“galvanize actual (or at least the appearance of) public support.” That parenthetical reveals the moral posture: optimize optics, whether or not consensus exists.

From lobbying to mobilization

Facebook lit up India with billboards, SMS blasts, and targeted ads reaching over half of adults. The most aggressive move: an in-product pop-up prompting users to send a form letter to TRAI, India’s telecom regulator, endorsing Free Basics. This was Uber’s playbook, scaled by a platform with far more reach and richer data. Suddenly a private company could manufacture a constituency and flood a democratic channel with templated speech.

Virality meets bureaucracy

The campaign generated 16.9 million submissions, but an opt-out click and a technical glitch meant only 1.4 million were logged. Panic ensued; staff delivered a flash drive to preserve the showing. But paperwork wasn’t the real hurdle. TRAI saw through the tactic and, on February 8, 2016, banned zero-rating altogether. Legality diverged from legitimacy; algorithmic mobilization couldn’t substitute for public interest.

Why it backfired

When a platform judges itself by measurable engagement, it mistakes pressure for persuasion. Policymakers discount choreographed mailstorms, civil society resists astroturf, and the public bristles at corporate intrusion into democratic processes. The effort also exposed asymmetry: Facebook could target, prompt, and tally at a scale unavailable to its critics—exactly the power imbalance net neutrality aims to check.

Lessons for platform politics

If you embed political megaphones into product, you need guardrails: transparency labels, public archives, rate limits on regulatory outreach, and strict prohibitions on manufactured consent. Align messaging with genuine coalition-building, not ad spend. And build willingness to lose a policy fight rather than win it by eroding process you’ll later depend on.

A civic design stance

The India episode is a case for “civic-respectful product design”: tools that amplify users’ voices must not become instruments for corporate will. Archive all political prompts; disclose targeting criteria; allow counter-speech visibility; and ensure regulators can audit aggregate effects. (Note: this anticipates post-2016 reforms like ad libraries and reduced political microtargeting.)

Key Idea

Mobilization without legitimacy is manipulation. In democracies, engineered consent corrodes both the cause and the company.

For your roadmap: treat policy audiences as partners, not targets. Success looks like co-designed frameworks, not mass-submitted form letters. When the temptation to “street fight” arrives, it’s usually a sign your product needs a constitutional fix, not a louder bullhorn.


Authoritarian Bargains In China

China concentrates the book’s ethical scrutiny. Mark’s “China 3-year plan” framed access as the last great frontier. The internal strategy moved from courtship to technical delivery: partner with Hony Capital (code name Jupiter), stand up local infrastructure, and build tooling to satisfy censorship and surveillance expectations. It wasn’t abstract; engineers scoped content moderation interfaces, facial recognition, and region-block switches—code paths to compliance.

The technical stack of compromise

Points of Presence (PoPs) inside China would speed experience but risked caching non-Chinese data along cross-border routes, exposing foreigners’ information to Chinese legal or extra-legal access. Moderation consoles would empower a local partner to fulfill takedown and data requests at state speed. Internal proposals aired extreme ideas: data access for authorities, forced ToS changes for Hong Kong, emergency switches for sensitive anniversaries. A risk memo warned bluntly that employee actions could facilitate “death, torture and incarceration.”

The missing constitution

The most troubling flaw wasn’t the ambition—it was the absence of public, principled limits. Leadership pushed on “how” to enter, not “whether” under nonnegotiable rights. In parallel, the CEO began personally intervening in content-policy decisions for geopolitical reasons, hollowing Community Standards into ad-hoc fiat. That combination—expansionist zeal plus leader-overrides—shrank the space for principled refusal.

Why red lines matter

Authoritarian markets invite deals that instrumentalize your capabilities against your users. If you don’t pre-commit to red lines—no data localization for repression, no partner-run censorship via your tools, no targeted identity sharing—commercial gravity will pull you toward complicity. You can’t improvise ethics at the negotiating table; you’ll be the weaker party, eager for scale.

Designing for refusal

Treat “no” as a feature. Build architectures that minimize sensitive data, resist compelled access, and make surveillance economically and technically costly. Publish a market-entry bill of rights, empower independent oversight to veto, and accept that some markets are off-limits without regime change. (Note: this echoes Microsoft’s stance on certain Russian requests and Apple’s contrasting concessions in China—illustrating the spread of choices across Big Tech.)

Key Idea

When you negotiate with the state, your code is your conscience. If your architecture bends to repression, your values will follow.

For you: pre-commit publicly to non-negotiables; align incentives so executives are rewarded for principled exits; and run tabletop exercises that test your willingness to walk. Scale without sovereignty is just surface area for capture.


Democracy, Safety, And Harm

When optimization meets politics and vulnerable users, the costs become human. The 2016 U.S. election, Myanmar’s atrocities, and teen emotional targeting map one pattern: powerful tools, weak guardrails, and leadership incentives tuned to revenue or PR over rights. If you build systems like these, you inherit obligations that look like public duty.

Elections as product tests

The Trump campaign operationalized Facebook’s ad stack: Project Alamo’s database fed Custom Audiences; Lookalikes expanded reach; tens of thousands of variants ran constant A/B tests; Brand Lift surveys guided iteration. The same machinery powered dark posts targeted at Black voters, young women, and Bernie supporters to suppress turnout—speech that the public couldn’t see or study. Internally, leaders split: some admired the craft; others feared the civic fallout. Revenue loved it.

Myanmar: neglect as complicity

In Myanmar, Facebook was the internet for millions. Yet community standards weren’t translated; reporting buttons broke; Burmese script rendering failed due to Unicode issues; and two remote contractors “handled” moderation. Civil society pleaded for help as hate campaigns escalated; hires were blocked; engineering focus drifted elsewhere. The U.N. later documented Facebook’s role in inciting violence during ethnic cleansing. Inaction in a high-risk context is an action with consequences.

Monetizing vulnerability

An Australian deck flaunted the ability to target 13–17 year olds when they felt “worthless,” “stressed,” or insecure—correlating deletes, posts, and interactions with mood signals to sell attention. Leadership initially issued a denial—“we don’t target based on emotional state”—despite internal knowledge to the contrary, and a junior researcher took the fall. The business logic was clear: this is what puts “money in all our pockets.” The ethical logic was absent.

Guardrails that work

You can design against these harms. For elections: require real-time public ad libraries, restrict political microtargeting to broad segments, and guarantee researcher access to aggregate reach and spend. For fragile contexts: localize standards, fund on-the-ground moderation, partner with civil society, and pre-define escalation triggers that pause growth features. For youth: ban targeting tied to inferred emotional distress, compartmentalize teen data, and invite independent audits under enforceable oversight.

Measuring the right outcomes

Shift KPIs from pure engagement and revenue to a balanced scorecard that includes harm reduction, time-to-mitigation, and trust signals from regulators and NGOs. Tie executive compensation to these metrics. (Note: think of it as a safety case approach, as used in aviation and nuclear sectors.)

Key Idea

Optimization without constraints will always find the most profitable boundary to cross. Build the constraints, or your product will test them on people you can’t afford to harm.

The deepest lesson is institutional: platforms that function like public utilities must be governed like them—by rules, independent oversight, and resources calibrated to the stakes. Without that, technical excellence becomes a delivery vehicle for social damage.
