
Why Information Grows

by César Hidalgo

In "Why Information Grows", César Hidalgo explores the transformative power of information, from atoms to economies. This insightful book reveals how human civilization thrives by embedding order in matter, creating prosperity through shared knowledge and robust networks.

The Growth of Information: From Atoms to Economies

Have you ever wondered why our world seems to get more complex, not less, even as time constantly pushes things toward decay? In Why Information Grows, César Hidalgo explores this paradox by asking a deceptively simple question: how does order—what he calls information—emerge and expand in a universe ruled by the second law of thermodynamics, where disorder and entropy are always increasing?

Hidalgo argues that the key to understanding the rise of complexity—from swirling galaxies to DNA molecules to human societies and modern economies—is rooted in the same fundamental physics. He claims that information is not abstract but physical. It is the arrangement of matter, the order that transforms a pile of atoms into a car, a skyscraper, or a smartphone. This embodiment of information explains why objects, organisms, and economies require energy, solid structures, and the ability to compute to keep information alive and growing. Humanity’s unique capacity to create and manipulate information—to imaginatively embed order in matter—is what makes us extraordinary.

From Boltzmann to the Modern Economy

The book opens by revisiting the nineteenth-century physicist Ludwig Boltzmann, who first connected the physical order of systems to probability and entropy. Boltzmann knew that the universe tends toward disorder, yet he saw paradoxical growth of complexity all around—life, society, technology. His failed attempt to reconcile this contradiction marks the start of Hidalgo’s story. By blending physics and economics, Hidalgo picks up Boltzmann’s quest and expands it: what Boltzmann explored in atoms and gases, Hidalgo explores across human networks and economies.

Information as Physical Order

Information, Hidalgo reminds us, isn’t merely data or meaning—it’s physical order itself. The world’s richness is the result of how atoms arrange themselves into intricate configurations, defying the general drift toward randomness. A crashed Bugatti and a functioning one share the same atoms but have vastly different levels of organization. The loss of order is the loss of information. What makes Earth special in the cosmos isn’t its abundance of matter or energy, but its extraordinary concentration of organized structures—information stored in biology, technology, and thought.

Energy, Solids, and Computation: The Trinity of Growth

For information to grow, Hidalgo says it must overcome entropy through three mechanisms. First, flows of energy—like sunlight or geothermal heat—keep systems out of equilibrium, allowing order to manifest. Second, solids preserve information by freezing dynamic patterns into lasting structures, as DNA stores biological order and technology stores human knowhow. Third, computation—the ability of matter to process information—turns atoms, cells, and societies into engines of creativity. Together, these principles explain how complexity arises and accumulates: matter learns, processes, and replicates patterns.

Human Networks as Information Processors

Hidalgo extends this logic to the social and economic realm. Human societies grow by accumulating knowledge and knowhow—the computational engines inside networks of people. Unlike DNA or proteins, human knowledge must be embodied in brains and relationships. This reliance on humans introduces limits: we can’t store infinite knowhow individually, so we rely on networks (firms, cities, institutions) to collectively hold and process it. These social computers are the true creators of physical order, or what Hidalgo calls crystals of imagination—products that materialize human creativity in tangible form.

The Evolutionary Arc: From Atoms to Economies

By reframing economies as systems of information growth, Hidalgo connects physics, biology, and sociology into one elegant continuum. Life and economies alike depend on energy, solid structures, and computation to resist entropy and generate complexity. Our products—the cars, chips, and cities we build—are frozen packets of imagination that amplify our capacity to make further information. Prosperity, then, is not about wealth accumulation but about increasing a society’s ability to make information grow. It’s why some nations advance faster than others: they harness larger social networks that can compute collectively and sustain complexity.

Why This Matters

Hidalgo’s core proposition reshapes how you can see progress, technology, and even everyday life. Every act of creation—writing a book, coding software, planting trees—is a small defense against entropy. Each embodies imagination turned solid. By interpreting economies as systems of information growth, Hidalgo reveals that inequality emerges not from capital or resources but from uneven access to computational capacity—our networks of knowledge. Understanding this makes you see the world differently: not as competing markets, but as evolving pockets of organized complexity, each an echo of the universe’s deepest drive—to make information grow.


Physical Order and the Nature of Information

One of Hidalgo’s most striking insights is that information is physical. He moves away from the abstract, digital notion of bits in cyberspace and returns to the tangible universe where atoms, energy, and entropy reign. From a crashed Bugatti to the DNA sequence in your cells, information is the arrangement of matter—the difference between disorder and complex functionality.

Entropy vs. Information

Entropy measures the number of possible equivalent states a system can occupy. A disordered stadium, where fans sit randomly, has more possible configurations than one where seats spell the word “INFORMATION.” Therefore, high entropy corresponds to randomness; low entropy corresponds to order. Information, for Hidalgo, is the inverse of entropy—it’s what makes rare states happen, the improbable alignment of atoms or ideas that creates something meaningful. This simple idea connects the physics of gas molecules to the making of cars, cities, or concertos.
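The stadium arithmetic can be made concrete. The sketch below (my own illustration, with invented numbers not taken from the book) counts the equivalent seating arrangements in each macrostate and takes a Boltzmann-style logarithm of that count:

```python
from math import factorial, log2

# Toy version of the stadium example: 20 fans can fill 20 seats in any order
# (a "disordered" macrostate), but only one specific arrangement spells out
# a chosen message (an "ordered" macrostate).
n_fans = 20
disordered_states = factorial(n_fans)   # every permutation counts as equivalent
ordered_states = 1                      # exactly one arrangement spells the word

# Boltzmann-style entropy in bits: S = log2(W), where W counts equivalent states.
entropy_disordered = log2(disordered_states)
entropy_ordered = log2(ordered_states)

print(f"disordered: {entropy_disordered:.1f} bits of entropy")
print(f"ordered:    {entropy_ordered:.1f} bits of entropy")
```

The ordered macrostate has zero entropy because it can be realized in only one way; the disordered one spans about 2.4 quintillion permutations, which is exactly why ordered states are rare.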

Shannon, Boltzmann, and the Meaningless Meaning

Claude Shannon’s mathematical definition of information—the number of bits needed to describe a message—echoed Boltzmann’s formula for entropy. Both deal with probability and arrangement, highlighting how physical systems encode configurations. Hidalgo builds on this equivalence to show that information has nothing to do with meaning. A tweet full of random characters carries more “bits” than a simple message, but it doesn’t convey order. Similarly, a random mix of atoms contains less structure than a precisely engineered Bugatti. Information, in both senses, lives in ordered correlations, not arbitrary noise.

Order Embodied in Matter

To make this concrete, Hidalgo invites you to imagine your hard drive. Flip all its bits randomly, and Shannon’s formula says you’ve increased information—but you’ve destroyed the meaningful order that turned zeros and ones into photos or music. That paradox captures Hidalgo’s point: real information manifests as physical order, correlations built across many scales. It’s what differentiates a tree from wood ashes, or a functioning phone from scattered circuits.
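The hard-drive thought experiment can be sketched numerically. This toy comparison (my own illustration, not from the book) computes the empirical Shannon entropy of a repetitive byte string versus randomly flipped bytes, showing how the random data maxes out Shannon's measure while destroying all structure:

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

random.seed(0)  # fixed seed so the run is reproducible
structured = b"INFORMATION " * 1000                             # ordered, repetitive
scrambled = bytes(random.randrange(256) for _ in range(12000))  # bits flipped at random

print(f"structured: {shannon_entropy(structured):.2f} bits/byte")  # low: predictable
print(f"scrambled:  {shannon_entropy(scrambled):.2f} bits/byte")   # near the 8-bit maximum
```

The scrambled drive scores near the theoretical maximum of 8 bits per byte—more Shannon "information"—yet all the multi-scale correlations that made the bytes into photos or music are gone.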

The Rarity of Order

Throughout physics and life, ordered systems are rare because they require specific paths through chaos. Hidalgo’s Rubik’s Cube analogy illustrates that only a few sequences of moves yield perfect order among trillions of possible ones. In this sense, order emerges through computation—through matter “searching” for stable configurations. This framing transforms your view of creativity and progress: whether in nature or society, producing information is not about accumulating bits but discovering paths through disorder that lead to enduring patterns. This is the fundamental bridge linking atoms, genes, and economies.


The Physics of Information Growth

Before information could fill our planet with life and technology, it had to overcome the universe’s relentless drift toward disorder. Hidalgo follows physicist Ilya Prigogine’s revolution in thermodynamics to explain how order can emerge spontaneously within chaotic systems. Prigogine’s work revealed that non-equilibrium systems naturally generate information, and Hidalgo uses this insight to show why the Earth—and you—are anomalies in a universe otherwise doomed to entropy.

Order from Chaos

Prigogine discovered that steady states far from equilibrium can self-organize. Think of the whirlpool forming as a bathtub drains: the water’s constant flow produces a stable, information-rich structure. Whirlpools, convection cells, and even candle flames exemplify how energy flows can create pockets of complexity. Earth itself is a vast non-equilibrium system kept alive by solar and geothermal energy. These flows make our planet a perpetual whirlpool of information—an oasis of order in a universe headed toward thermodynamic peace.

Solids, Stickiness, and Permanence

Once formed, information tends to be fragile. Whirlpools disappear when the faucet closes; cigarette smoke dissipates in seconds. Hidalgo, building on Erwin Schrödinger’s ideas from What Is Life?, introduces the role of solids as information stabilizers. In crystals like DNA, structural patterns endure because thermal motion cannot easily disrupt them. Solids preserve information, offering durability and accumulation. Proteins, cities, and buildings—all solid objects—allow information to persist and recombine, enabling life and economies to grow.

Computation: Matter Learns to Think

Hidalgo’s final ingredient is computation: matter’s ability to process information. Trees, he explains, are computers powered by sunlight. They “decide” when to shed leaves or grow roots by processing environmental data. Similarly, chemical reactions can act as primitive computers that adapt based on inputs and feedback. Life itself, Hidalgo concludes, is what happens when matter learns to compute persistently. Computation—at atomic, biological, and social scales—lets matter organize itself into patterns, store these patterns, and build upon them.

Irreversibility, Chaos, and the Entropy Barrier

Through thought experiments with ping-pong balls, Hidalgo demonstrates that even if we could measure and reverse every particle’s motion, we could never reverse time: the information required would be infinite. Time’s arrow emerges from the universe’s chaotic complexity—it carries an “entropy barrier” that makes the past unreachable. This irreversible computation is what fuels evolution: every instant is a new calculation. Recognizing this, you can see that the growth of order doesn’t violate physics—it exploits the universe’s own dynamics to create islands of information that compute forward, not back.


Crystallized Imagination: How Humans Make Order

Much of the universe’s complexity is accidental, but human complexity is deliberate. Hidalgo calls our creations—products, technologies, art—crystals of imagination. They are physical embodiments of ideas first born in our minds, then frozen in matter. What separates us from other species isn’t opposable thumbs or language alone, but our ability to make the intangible tangible, to turn fiction into form.

From Apples to iPhones

Hidalgo’s comparison between the apples we eat and the Apples we use crystallizes this idea. Biological apples existed before minds conceived them; technological Apples (iPhones) existed first as ideas before becoming physical products. Humans export imagination to the physical world. Every object—from a coffee mug to a robotic limb—represents a strand of imagination solidified by knowledge and knowhow.

Information with a Source

Humans create information that originates in creativity rather than in randomness. This distinction underpins Hidalgo’s concept of imagination’s balance. When Chile exports copper to Korea, it sells raw matter; when Korea exports cars to Chile, it sells crystallized imagination. The “balance of trade” thus hides a deeper “balance of imagination.” Developed economies are rich not because they own resources but because they have large networks that embody knowhow to turn imagination into reality.

Products as Augmentation

Products amplify human capacities. A guitar lets you sing with your hands; a plane lets you fly; toothpaste lets you keep your teeth into old age. We use physical order to extend the limits of our biology, gaining new forms of expression and creativity. Economies, in this view, are vast amplification networks—systems that scale our capacity to act and imagine through crystallized objects.

Creativity and Collective Genius

By turning imagination into matter, humans become social amplifiers of genius. Kelvin Doe, a Sierra Leonean teenager who made batteries from scrap materials, and MIT researchers Hugh Herr and Ed Boyden (who build robotic legs and optogenetic interfaces) illustrate this universal drive. Their work shows that progress comes not merely from ideas but from embedding imagination in matter—creating crystals of imagination that expand the universe’s repertoire of order.


Knowhow, Networks, and the Personbyte Limit

Why do some societies produce jet engines while others make shoes? Hidalgo argues that the answer lies in how knowledge and knowhow—our human computational capacities—are distributed. He introduces the concept of a personbyte: the maximum amount of knowledge and knowhow a single individual can hold. Complex products require more than one personbyte, meaning they can only be created through networks of people and firms.

Learning Is Experiential and Social

Unlike data in a hard drive, knowledge can’t simply be downloaded—it must be learned through experience and interaction. Air traffic controllers, surgeons, and musicians all develop knowhow through practice and collaboration. This social nature of learning means information is geographically biased: clusters like Silicon Valley or MIT grow because expertise accumulates through proximity and shared practice.

The Quantization of Knowhow

The personbyte frames human limits physically and conceptually. Once a product requires more knowledge than fits in one person, it must be divided and distributed among people. A team making a satellite or a symphony becomes an interconnected web where each node (person) holds a fragment of knowledge. The challenge of innovation, then, is not inventing ideas but connecting enough personbytes efficiently to reconstitute complex knowhow without losing coherence.

Networks as Collective Brains

Teams, firms, and cities act as collective computers that combine personbytes into larger computational capacities. The Beatles together were greater than the sum of their solo careers; the Apollo space program exceeded any individual’s mental limits. Understanding this helps you grasp why economic development requires social structures—trust, communication, and institutions—that allow multiple personbytes to interact and produce organized complexity.


Trust and the Formation of Social Networks

If knowledge growth depends on networks, what makes networks possible? Hidalgo answers: trust. Drawing on sociologist Mark Granovetter and political scientist Francis Fukuyama, he explains how trust reduces transaction costs and enables large human collaborations—the hidden engine behind economic complexity.

Social Networks Embedded in the Economy

Granovetter showed that most people find jobs through personal connections, not formal markets. In Boston’s apartment and labor markets, Hidalgo observes the same: the best opportunities travel through social ties. Economic systems are “embedded” in social networks that filter information and shape outcomes. These networks, built via shared foci (schools, workplaces) and homophily (similar interests), determine who learns from whom and therefore where knowledge grows.

Trust and Network Size

Fukuyama’s distinction between familial and high-trust societies explains why some nations form large professional networks while others rely on families. In countries like Japan or Germany, high trust allows strangers to cooperate through institutions, forming diverse, scalable firms. In familial societies like Italy or Latin America, trust is confined to kinship, producing small family businesses and occasional state intervention to compensate for social fragmentation. The outcome: different scales of economic complexity.

Trust as Economic Glue

Trust makes links cheaper to form. Firms in high-trust settings waste less on contracts, monitoring, or bureaucracy. The diamond merchants of New York, who trade jewels on handshake deals, exemplify this efficiency (James Coleman’s case study). Conversely, low-trust environments require heavy formal institutions and paperwork, reducing adaptability—as seen in Boston’s Route 128 tech cluster compared to Silicon Valley’s porous, trust-driven ecosystem.

Networks that Learn and Adapt

Hidalgo shows how trust not only enables large networks but also makes them adaptable. In Silicon Valley, trust allowed ideas to flow—from Xerox PARC’s GUI technology to Apple—ensuring that innovation survived even as firms changed. Economies prosper when trust lowers friction, expands connectivity, and facilitates recombination—the same dynamics that make information grow in nature. The lesson: social ties are as vital to complexity as solar energy is to life.


Economic Complexity and the Product Space

To translate networks, knowhow, and trust into measurable economic outcomes, Hidalgo introduces the concept of economic complexity—a way to quantify how much knowledge and knowhow is embodied in a country’s industries. His metaphor: economies are jigsaw puzzles of interlocking industries, each requiring distinct personbytes. The more pieces fit together, the more complex—and prosperous—the society becomes.

Nestedness: The Triangular Pattern of Global Production

When visualizing the matrix of countries and the products they export, Hidalgo finds that simple industries (like garments) are ubiquitous, while complex ones (like aircraft) concentrate in few nations. These matrices form triangular, or “nested,” patterns: less diverse economies export subsets of the products made by diverse ones. Complexity, therefore, correlates with diversity and specialization. Places with many interlocking industries hold more personbytes and can make rarer, high-knowledge goods.
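The nested pattern is easy to see on a toy country-product matrix. The 0/1 export data below are invented for illustration; the check confirms that each less diverse economy's exports are a subset of every more diversified one's:

```python
# Rows = countries, columns = products ordered from ubiquitous to rare.
matrix = [
    [1, 1, 1, 1, 1],  # most diverse economy exports everything, incl. rare goods
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],  # least diverse economy exports only the ubiquitous good
]

diversity = [sum(row) for row in matrix]       # products exported per country
ubiquity = [sum(col) for col in zip(*matrix)]  # countries exporting each product

def is_nested(m):
    """True if every country's exports are a subset of any more diverse country's."""
    rows = sorted(m, key=sum, reverse=True)
    return all(
        all(lo <= hi for hi, lo in zip(rows[i], rows[i + 1]))
        for i in range(len(rows) - 1)
    )

print(diversity, ubiquity, is_nested(matrix))
```

In this perfectly triangular toy case, diversity and ubiquity mirror each other; real trade data are noisier but show the same nested tendency.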

The Product Space: Mapping Industrial Relatedness

Hidalgo’s groundbreaking product space map connects industries that share capabilities. Countries diversify toward nearby products in this network—those requiring similar knowhow. Malaysia’s evolution from garments to electronics follows this rule. Like biological evolution, economic diversification occurs through adjacent possibilities: industries evolve by recombining existing knowledge rather than leaping across vast gaps.

Complexity Predicts Prosperity

By combining product ubiquity and network diversity, Hidalgo defines a measure—economic complexity—that predicts future income better than traditional metrics like GDP. Countries below their complexity potential (India, China circa 1985) tend to grow faster as their production catches up with their latent knowhow. Rich nations hold dense industrial networks, allowing information to grow explosively, while developing ones struggle to connect enough personbytes. Complexity, not just capital, explains prosperity.
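The combination of diversity and ubiquity can be sketched via the iterative averaging scheme ("method of reflections") from Hidalgo and Hausmann's economic-complexity work, shown here in simplified form on an invented toy matrix:

```python
# Toy export matrix: rows = countries, columns = products.
m_cp = [
    [1, 1, 1, 1],  # country 0: diverse, exports rare products too
    [1, 1, 0, 0],
    [1, 0, 0, 0],  # country 2: exports only the most ubiquitous product
]

def reflections(m, iterations=4):
    """Iteratively average product ubiquity into country scores and vice versa.

    An even iteration count keeps the country ranking aligned with complexity."""
    n_c, n_p = len(m), len(m[0])
    kc0 = [sum(row) for row in m]        # k_c,0: diversity of each country
    kp0 = [sum(col) for col in zip(*m)]  # k_p,0: ubiquity of each product
    kc, kp = kc0[:], kp0[:]
    for _ in range(iterations):
        kc_next = [sum(m[c][p] * kp[p] for p in range(n_p)) / kc0[c]
                   for c in range(n_c)]
        kp_next = [sum(m[c][p] * kc[c] for c in range(n_c)) / kp0[p]
                   for p in range(n_p)]
        kc, kp = kc_next, kp_next
    return kc

print(reflections(m_cp))  # higher score -> more complex economy
```

After a few iterations, the diverse country that exports rare products ranks highest—capturing the intuition that complexity lives in making many things, especially things few others can make.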


The Marriage of Knowledge, Knowhow, and Information

In his final chapters, Hidalgo weaves together physics, biology, and economics to show that knowledge, knowhow, and information are inseparable. A seed, he explains, packs information (DNA sequences) and knowhow (cellular machinery) to recreate a tree. Similarly, societies pack knowhow and information in people, firms, and technologies to reproduce complexity across generations.

Packaging Complexity

Life reproduces because it combines instruction and implementation—DNA’s coded data and the proteins that know how to use it. Economies lack this tight coupling. You couldn’t rebuild modern industry on a desert island using only books or computers; knowhow resides in networks, not texts. Without physical and social structures to unpack knowledge, information alone is sterile—like DNA without a cell.

Loss and Persistence of Knowhow

History demonstrates how isolation destroys computational capacity. Jared Diamond’s examples of Tasmanians and Polynesians losing technologies mirror what happens when societies disconnect; complex knowhow evaporates. Attempts like Ford’s ill-fated Fordlandia show how transplanting industrial networks fails when social structures can’t unpack the encoded information. Economic ecosystems, like ecological ones, depend on continuity and interaction.

The DNA of Economies

Bringing science full circle, Hidalgo concludes that economies and organisms share the same logic: both are systems where information persists through solids (technology), flows of energy (work and trade), and computation (people and networks). The growth of information—our ability to imagine, store, and process knowledge—is the central thread of evolution. The physics that produced whirlpools and crystals also produced entrepreneurs and cities. Understanding this unity changes how you see our planet—not as just a marketplace, but as the universe’s greatest engine for making information grow.
