Subtitled “The Evolution of Order, from Atoms to Economies”
(Basic Books, 2015, xxi + 232 pp, including 51 pp of acknowledgements, notes, and index)
A few weeks ago I sat down to read the new Yuval Noah Harari book, NEXUS, and just a few pages in noticed a footnote crediting much of his understanding of the concept of “information” to César Hidalgo’s WHY INFORMATION GROWS. And I realized, I have that book! It’s that green one! So I got it off the shelf and browsed a bit. Based on that browsing, his thesis seems very similar, at least analogous if not identical, to a key idea in the Brian Greene book we read recently, UNTIL THE END OF TIME (review here) — the idea that order can build up, despite the 2nd law of thermodynamics, but only *temporarily*, even if temporary means billions of years, because over the course of the life of the *entire* universe, that order (or information) will dissolve back into entropy. (Greene calls this “the entropic two-step”.) How interesting! That meant the little green Hidalgo book has ties to two other big books. So I set Harari aside, to get Hidalgo’s ideas from the source. And here we are. How does this idea relate to “information”? How exactly does he define “information”? That’s what Harari deals with too.
(More broadly, the idea that evolution can happen despite the 2nd law of thermodynamics, an idea disputed by naive, ill-educated people who desperately want evolution to be untrue so that they can think their Bible is true, relates to a broad range of topics. It also relates to another 2015 book I haven’t read, Matt Ridley’s THE EVOLUTION OF EVERYTHING.)
So: the universe is made of energy, matter, and information, and it’s the last that makes it interesting. Information hides in pockets as the universe succumbs to entropy. Nevertheless, information grows. Without summarizing the entire book, here are some key points:
- The introduction looks back at Ludwig Boltzmann, who in the late 19th century tried to explain the existence of physical order. Yet his theory predicted the opposite: that order should disappear [the standard notion of entropy]. He gave up and committed suicide. Fifty years later, the study of secret codes and information by Shannon and others concluded that information is in a sense meaningless; it can be manipulated regardless of its meaning. Context and prior knowledge provide meaning, e.g. “September 11.” Shannon’s formula for the most efficient encoding of an arbitrary message turned out to be the same as Boltzmann’s (the two formulas appear side by side after this list).
- The key idea of the book is that information arises from three facets of the universe: out-of-equilibrium systems, the accumulation of information in solids, and the ability of matter to compute. An example of the second: the 21st century is quite different from the environment in which our species evolved; it’s full of objects that are information embodied in physical order. (Not to mention DNA, etc.) A running theme is the distinction between “knowledge” and “knowhow”; the latter is the capacity to perform actions, even if we don’t “know” how to do them.
- Information is measured (in Shannon’s formulation) by the number of bits needed to communicate an arrangement. Thus — paradoxically — a randomized set of bits contains *more* information than a well-ordered file (see the compression sketch after this list). Entropy is not the same as disorder; to Shannon, information and entropy are functionally equivalent.
- Time is related to the physical order of information through the laws that govern large collections of particles. Boltzmann’s idea became the second law of thermodynamics. Information, Ilya Prigogine explained, emerges from physical systems that are out of equilibrium: from dynamic steady states rather than static steady states. The Earth is out of equilibrium within the larger system of the universe — the sun and nuclear decay keep the planet out of equilibrium — and so information can accumulate, and become embodied in solids, like aperiodic crystals, or DNA, or all the artifacts that humans have created. The final key is that matter has the ability to compute, as in chemical systems and life-forms.
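Since the Shannon–Boltzmann equivalence is the hinge of that introduction, here are the two formulas in their standard textbook forms (my addition, not quoted from the book); they differ only by a constant factor and the base of the logarithm:

```latex
% Shannon entropy: the average number of bits needed to encode a message
% whose symbols occur with probabilities p_i
H = -\sum_i p_i \log_2 p_i

% Boltzmann-Gibbs entropy: the entropy of a physical system whose
% microstates occur with probabilities p_i (k_B is Boltzmann's constant)
S = -k_B \sum_i p_i \ln p_i
```

And as a quick, hands-on illustration of the “random bits carry more information” paradox, here is a minimal sketch (mine, not the book’s) using a general-purpose compressor as a stand-in for Shannon’s minimal encoding: an ordered file squeezes down to almost nothing, while random bytes barely compress at all.

```python
import os
import zlib

N = 100_000

ordered = b"A" * N       # a well-ordered "file": one byte repeated N times
random_ = os.urandom(N)  # a randomized set of bytes

# The compressed size approximates the number of bytes needed to
# communicate each arrangement -- Shannon's measure of information.
print(len(zlib.compress(ordered, 9)))  # ~100 bytes: little information
print(len(zlib.compress(random_, 9)))  # ~100,000 bytes: maximal information
```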
This takes us only about a quarter of the way into the book, but those are the key points I take away, given how they relate to the big themes of Greene and others. The author is as concerned with economics as with life. The balance of the book concerns products as objects that embody the knowledge and knowhow of others, and thus drive the expansion of the economy over time. He imagines the maximum amount of knowledge and knowhow a single person can hold as a “personbyte,” and describes how structures vastly larger than one personbyte are built through firms (a “firmbyte”), the study of which is now known as “transaction cost theory” or “new institutional economics.” There are ideas about why certain products are associated with various specific regions of the world; about social interactions and social networks; about how large networks are a reflection of a society’s level of trust. And ideas of economic complexity like nestedness. Economic growth is explained by five factors: matter, energy, knowhow, knowledge, and information. Economies are like ecosystems; they don’t survive in isolation. (Would the knowledge and knowhow behind a global economy survive in an isolated group of teenagers?)
It all starts with those three mechanisms. And I’ll quote his final paragraph.
As the universe moves on and entropy continues to increase, our planet continues its rebellious path marked by pockets that are rich in information. Enslaved by the growth of order, we form social relationships, make professional alliances, beget children, and, of course, laugh and cry. But we often lose sight of the beauty of information. We get lost in the urgency of the moment, as our minds race like whirlpools and our lives compute forward in a universe that has no past. We worry about money and taxes instead of owning the responsibility of perpetuating this godless creation: a creation that grew from modest physical principles and which has now been bestowed on us.