
Who Can You Trust?

by Rachel Botsman

Who Can You Trust? explores the evolution of trust from traditional institutions to peer networks, driven by technology. Rachel Botsman reveals how platforms like Airbnb and Uber engineer trust between strangers, how blockchain could revolutionize transparency, and why institutional trust is waning in the digital age.

The Rise of Distributed Trust

How do you decide whom to trust in a world where strangers handle your money, data, and even your safety? In Who Can You Trust? Rachel Botsman argues that humanity is living through a profound shift—the Third Trust Revolution—in which trust moves from local relationships and institutions to distributed networks and platforms. You now rely on invisible systems of reputation, algorithms, and digital ratings rather than face-to-face judgment or authoritative gatekeepers.

Botsman’s central idea is that trust is not dying; it is changing form. You have moved from local trust (people you know), through institutional trust (banks, brands, governments), to distributed trust (systems that let you interact safely with strangers and machines). To understand that change, you must see how technology reduces unknowns—the essence of her definition: trust is a confident relationship with the unknown.

The erosion of institutional trust

Botsman traces how scandals—from Lehman Brothers and Volkswagen to BP and FIFA—eroded public belief in institutions. These failures created distrust shaped by inequality (some never face consequences), the twilight of elites (the digital flattening of authority), and echo chambers (algorithmic bubbles reinforcing bias). You began to rely more on peers, collaborative journalism, and platforms that felt more accountable.

This backdrop made distributed trust possible. Once you stopped automatically trusting governments and experts, you began to test and verify through networks—the Edelman Trust Barometer captures this shift from vertical to horizontal trust. The crisis of authority cleared the space for new mechanisms: ratings, algorithmic moderation, and identity protocols.

The architecture of distributed trust

Platforms such as Alibaba and Airbnb offer vivid case studies. Jack Ma’s TrustPass and Alipay turned suspicion of strangers into routine economic confidence by introducing escrow, verified identities, and transparent reputation. You trust other drivers on BlaBlaCar or hosts on Airbnb not because regulators certify them, but because layered signals—reviews, badges, guarantees—reduce uncertainty. This layered process forms Botsman’s Trust Stack: first trust the idea, then the platform, then the person.

Every new behavior involves a trust leap—that moment you act beyond certainty: staying in a stranger’s home, paying with invisible money, or using an autonomous car. Trust leaps are not irrational: they happen when systems make the unknown familiar enough. The push away from centralized intermediaries toward reputation-driven networks underpins your daily interactions.

From signals to systems

Botsman warns that trust is fragile. You are wired to make snap judgments based on appearance and social signals—a smiling face, professional demeanor, or polished profile—yet such cues (as shown by Jon Freeman’s research) often bear little relation to actual trustworthiness. In distributed systems, trust signals can be fabricated. Technologies like Trooly and UrbanSitter illustrate how better design can combine verified credentials with behavior-based scoring to minimize deception. You must learn to ask, as Onora O’Neill advises: “trustworthy for what?”

Reputation as infrastructure

Reputation becomes the new currency—used everywhere from ride-sharing to the darknet. Even Silk Road’s illegal market relied on ratings, escrow, and feedback to maintain order where law could not reach. High reputation lowered risk and raised prices, proving that distributed trust can self-regulate—but also showing its vulnerabilities when fake reviews or exit scams undermine credibility. Botsman underscores that reputation systems are double-edged: they can democratize trust, yet also amplify inequality or surveillance when controlled by powerful actors.

The frontier of algorithmic trust

Objects and algorithms now perform trust work once done by humans. Blockchain removes intermediaries through cryptographic proof—yet still centralizes power in mining pools or developers (as the DAO hack demonstrated). Robots like “Bert C” show that emotional design influences human trust more than technical competence, while failures like Microsoft’s Tay expose how machine learning can magnify moral failure. These examples remind you that code can’t replace ethics; distributed systems still need responsibility, oversight, and moral architecture.

The dangers of total reputation

China’s Social Credit System and Ant Financial’s Sesame Credit embody the extreme form of distributed trust—where data-driven scoring becomes political governance. Your friends’ habits, purchases, and online behaviors can affect access to housing, travel, or work. Rewards and punishments replace freedom of judgment. Botsman cautions that when reputation shifts from voluntary evaluation to enforced obedience, trust turns into control.

Designing for repair

In closing, Botsman argues for people-first trust design. Projects like Tala in Kenya demonstrate how data can empower rather than exploit—the system builds trust by measuring behavior with empathy and transparency. She urges accountability frameworks (“rate the raters”), algorithmic audits, appeal systems, and cross-platform ethics. Distributed trust works only when shaped by human values, not surveillance logic. Technology can heal broken trust—but only if it remains accountable to the people it serves.

Core message

Trust is migrating—from institutions to platforms, from authority to reputation, from physical presence to digital proof. The challenge isn’t whether you can trust technologies or strangers—it’s whether you can build systems where accountability, empathy, and transparency survive the transition.

Botsman’s book is therefore both diagnosis and manual: a call to understand trust’s new geography and to shape it deliberately before it shapes you.


Building and Losing Institutional Faith

Before distributed trust could thrive, institutional trust had to falter. Botsman shows how repeated crises shattered collective confidence: financial collapses, corporate scandals and opaque governance left people looking elsewhere for security. When Lehman Brothers collapsed on her wedding day, Botsman saw the symbolism—a society realizing its old pillars were unreliable.

The cascade of betrayal

The Tuskegee medical scandal, BP’s Deepwater Horizon explosion, FIFA’s corruption, and the Panama Papers exposed systemic moral hazard. Banks and elites escaped penalties, governments spun disasters, and journalists uncovered truths only through collaboration. These crises accumulated until you began expecting failure rather than stability from institutions.

Why distrust spreads

Distrust is contagious. Once you lose faith in one institution, you question others. Botsman interprets this as a social rebalancing: your compass for trust now points to peers and platforms rather than governments. The “twilight of elites,” a decline Christopher Lasch anticipated decades earlier, becomes reality through technological flattening—where social proof on a feed trumps credentialed expertise.

From betrayal to alternative trust

The Edelman Trust Barometer shows steady decline in faith in media, business, and politics. Out of this vacuum come blockchain advocates, peer-to-peer markets and social networks promising direct accountability. But Botsman warns that the cure isn’t simple substitution; institutions still matter. The new networks will need their own transparency and moral guardrails or they will replicate the same failures in disguise.

Essential lesson

You are witnessing a trust vacuum—old systems lost legitimacy before new ones developed ethical maturity. The next era depends on redesigning accountability, not merely replacing actors.

Institutional failure doesn’t mean trust dies; it simply migrates. The responsibility now is to ensure new digital institutions aren’t hiding the same fragilities behind algorithmic walls.


The Mechanics of the Trust Stack

Every innovation begins as a leap into ambiguity. Botsman’s concept of the Trust Stack explains how people climb gradually from idea to adoption: first trusting the idea, then the platform, then the individual or machine delivering the experience. Miss any rung, and confidence collapses.

Trusting the idea

To trust a new concept, it must feel “strangely familiar.” Using the California Roll metaphor, Botsman shows how entrepreneurs present novelty wrapped in familiarity—like BlaBlaCar framing ride-sharing as planned travel with shared costs rather than dangerous hitchhiking. If the idea feels safe and useful, adoption begins.

Trusting the platform

Platforms then reduce friction: prepaying through BlaBlaCar or Alipay, verifying IDs on Airbnb, or building escrow functions. Each removes barriers so curiosity becomes commitment. The platform’s job is pragmatic—design processes that make risk minimal and interaction smooth.

Trusting the person or machine

Finally, you judge the individual or AI delivering the service. Ratings, reviews, badges, and verified data points close the loop. Botsman stresses the significance of social proof and influencers—when pensioners adopted TransferWise, they legitimized peer-to-peer finance for skeptical users. Likewise, early adopters of self-driving cars take the leap when the personal benefit is clear—the “what’s in it for me” (WIIFM) factor of saved time and convenience.

Designing trust, therefore, means designing familiarity, frictionless interaction, and credible proof. Miss one and public confidence shrinks. For innovators, the Trust Stack is a diagnostic tool: check if each layer aligns with human psychology before expecting mass adoption.

Takeaway

Building trust isn’t marketing; it’s architecture. Every successful idea climbs through stages of reassurance, proof, and social validation before it becomes an everyday habit.

Once you understand this structure, you see why some startups scale rapidly while others remain curiosities. Trust isn’t viral—it is engineered step by step.


Platforms and the Price of Accountability

Botsman argues that the platforms shaping modern trust also blur responsibility. When Uber or Airbnb connects millions of strangers, who takes the blame when things go wrong? The Kalamazoo shootings in 2016 exposed the limits of platform neutrality: Uber treated drivers as independent contractors, slowing response times despite passenger warnings. Platforms claim to “only connect” users, but they mediate more than connections—they design risk itself.

The problem of black-box design

Algorithms that match riders, rank posts or verify users often operate opaquely. Researcher Coye Cheshire calls this opacity a lack of social translucence: you cannot see how or why a system made its decision. Without that visibility, accountability evaporates. When Facebook manipulated emotional content in user feeds without consent, outrage followed precisely because the invisible manipulation breached moral expectations.

Shifting responsibility

Traditional organizations held clear liability boundaries: Tesco’s horsemeat scandal led to corporate admissions and supplier reforms. Platforms diffuse those borders, claiming neutrality even as they profit from human interaction. Botsman suggests accountability must be redesigned—through transparent algorithms, audit trails, emergency escalation paths, and real-time flagging systems that prioritize safety over scale.

Design principle

Visible accountability matters. Trust in platforms only endures when users can see cause, effect and corrective action—not just ratings or disclaimers.

Platforms now function as de facto institutions. The challenge for you as a participant or designer is to insist that they adopt the same transparency and ethical obligations expected of old authorities, even if they operate through code instead of contracts.


Reputation Systems and Their Shadow Side

Reputation is the new currency of trust—but it can both protect and threaten. Botsman explores reputation systems from Amazon reviews to darknet markets, showing their paradox: they coordinate honesty among strangers yet can also produce manipulation, discrimination and coercive surveillance.

Reputation replacing institutions

Silk Road’s illegal marketplace used ratings and escrow to enforce reliability without law enforcement. High-rated vendors earned premiums because reduced uncertainty carried value. Community policing and feedback loops kept fraud low until greed broke the system (exit scams, sock-puppets). Reputation can replace legal structure—but only temporarily, until opportunism enters.

Two-way ratings and bias

Platforms built reciprocity—Airbnb hosts rate guests, Uber drivers rate riders—to align incentives. Yet studies (Harvard researchers Benjamin Edelman and Michael Luca) reveal bias: minority users get lower ratings and fewer bookings. Reviews can turn into threats or transactions rather than authentic signals. Trust scores concentrate social power among those already privileged.

The rise of social scoring

China’s Social Credit System scales reputation into governance. Sesame Credit aggregates behavior, consumption and relationships into numbers that decide mobility rights and visa priorities. The game-like interface hides authoritarian logic: good behavior earns perks; low scores restrict opportunities. It’s “Yelp reviews with the nanny state watching,” as Rogier Creemers put it.

This drift from voluntary rating to enforced evaluation turns trust into obedience. Botsman’s warning is clear: reputational data becomes political when used punitively. The same mechanisms that enable peer commerce can discipline entire societies if unaccountable authorities run them.

Moral insight

Reputation systems teach that trust always carries power. Whoever defines credibility defines freedom. Design must include the right to contest, correct and opt out.

For global users, the lesson is vigilance: demand transparency in scoring logic, appeals for harm, and humane metrics that reward fairness instead of conformity.


Technology, Ethics, and People-First Trust

Botsman ends with optimism—but conditional optimism. Distributed trust can empower billions if guided by empathy, transparency and shared governance. Technology is not destiny; it is design. Projects like Shivani Siroya’s Tala illustrate how data can rebuild trust from the ground up rather than exploit it from above.

Human-centered innovation

Tala analyzes mobile behaviors—call patterns, timing, app usage—to score credit for the unbanked. Its repayment rate above 90% reflects how genuine understanding fosters trust. By designing reminders that align with customer habits, Tala converts data into dignity rather than surveillance.

Ethical infrastructure

Botsman advocates practical guardrails: transparent algorithms, appeal mechanisms, portability of data and independent auditing. Thought leaders such as Mark Meadows and Stephen Cave call for “bot license plates” and systems that declare uncertainty instead of pretending infallibility. Trustworthy technology must show its reasoning and its limits.

Mutual accountability

Kevin Kelly’s idea of “coveillance”—mutual watching rather than top-down surveillance—fits Botsman’s repair ethos. Instead of panopticons, aim for reciprocal transparency, where institutions and users monitor each other respectfully. In distributed environments, everyone becomes part of the trust network, not passive subjects under scrutiny.

Forward vision

Distributed trust succeeds only when built around human dignity. Algorithms, ledgers, and ratings are tools; empathy, responsibility and transparency are the foundations.

Your role is not just to participate but to shape. Demand that platforms design with people in mind, hold technologies to moral account, and remember that confidence in the unknown must always remain a human achievement, not a mechanical illusion.
