Ignorance, the Ultimate Asset
As dozens of venture capital firms fret about A.I., we reprise our decade-old celebration of uncertainty as the predicate for innovation.
This article was originally published at The American on January 24, 2013. It summarizes our basic view of tech innovation and the economy. Low-entropy knowledge platforms, we argue, are the foundation for high-entropy messages of truth, enterprise, and growth. A.I. is a crucial new knowledge platform. And yet the world frets over A.I. “safety.” Dozens of venture capital firms just signed a Responsibility Protocol to shackle A.I. The key risks, in our view, are not killer robots or runaway super-intelligence but regulatory capture, industry concentration, and political censorship of A.I. platforms. We’ve got much more coming soon on the many facets of A.I. – its impact on productivity, growth, jobs, investments, sectors like healthcare, inputs like electricity and GPUs, and of course the “safety” and censorship questions. But first, we reprise the basics.
Ignorance, the Ultimate Asset
The economy is stagnating, but there is no broad agreement as to why. Democratic policy hasn’t worked in the last few years, but Republicans governed when the Panic and Great Recession began. Some economists say negative real interest rates inflated the bubble. Others say high interest rates precipitated the crash. Inflation, for sure, is now the key danger – or is it deflation? The problem is really technology: we’re in a four-decade innovation rut. No, comes the rebuttal, technology moves too fast for workers to keep up. For some, Keynes is back! For others, Keynes has been proven wrong – yet again.
Economic and political arguments aren’t going away. But fresh perspectives may be helpful. When I think about what the economy looks like, I start with information. And when one starts with information, one starts with Claude Shannon.
“I think that this present century,” Shannon remarked in 1960, “will see a great upsurge and development of this whole information business.” The modest Midwestern mathematician may have understated the case just a bit.
Indeed, Shannon is now known as the father of the information age, and the “knowledge economy” has become a dominant marker of our time, perhaps on par with the Industrial Revolution. How ironic and discouraging, then, that our economic models and policies don’t more fully reflect this pervasive force.
Shannon lived the information revolution. As a boy, he built a telegraph to the next farm using a barbed-wire fence. As an MIT graduate student in 1938, Shannon wrote a master’s thesis on how the combination of Boolean algebra (AND, OR, NOT) and binary math (1s and 0s) could yield a new kind of electronic circuitry. He thought it could change the existing telephone network (and it did). He had, however, done something much bigger, essentially defining the digital computer before we had built or really conceived one.
A decade later, at Bell Labs in New Jersey, just as William Shockley, John Bardeen, and Walter Brattain were building the first transistor, Shannon, working quietly down the hall, was publishing an even deeper, more general work. In The Mathematical Theory of Communication, Shannon defined information, quantified it in “bits,” and described an elusive concept dubbed “entropy.”
He showed you could compress information into codes shorter than the message itself and still reproduce that message faithfully. He showed you could conquer noise with redundancy. He showed how bandwidth was the key enabling (and limiting) factor in data transmission. These simple insights were the basis, initially, for digital phone networks and linguistic analysis, and, later, the foundation of the digital economy – computers, mobile phones, the Internet, and digital storage media of all kinds, from CDs to flash drives. Information theory is now also the basis for genetics and proteomics – DNA codes for amino acids.
Shannon’s breakthrough flowed from a bold insight: he focused not on information but entropy. Information is what we know; entropy is what we don’t know.
What You Don’t Know Can’t Hurt You
A cousin of thermodynamic entropy, which refers to atomic disorder in a physical system, Shannon’s entropy measures the amount of uncertainty in an information system. Shannon’s entropy can thus be described as the potential for news or surprise. It is invisible information.
Entropy depends on the degrees of freedom enjoyed by the sender of a message. If the range of possible messages sent along a communication channel is slim, there’s little room for surprise. If the sender can choose among millions of messages, it’s more difficult to predict what will be said. A parakeet that knows two words is a low-entropy source. William F. Buckley, Jr., with his vast vocabulary and multifaceted contrarian ideas, was a high-entropy source.
It’s easy, moreover, to compress that parakeet into a computer program. If I send a “1,” say “Polly.” If a “0,” then “cracker.” Buckley, on the other hand, is very difficult to describe even in an infinite string of bits. A pattern is easily compressible: for 10101010…, just print 10 and repeat. A more random string – 101110100011… – requires more bandwidth or storage space to represent with high fidelity.
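To make the contrast concrete, here is a minimal sketch (ours, not the original article's) of Shannon's entropy measure applied to a two-word source versus a large-vocabulary one, in Python:

```python
import math
from collections import Counter

def shannon_entropy(messages):
    """Entropy in bits per message, estimated from observed frequencies."""
    counts = Counter(messages)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A two-word parakeet offers at most one bit of surprise per utterance.
parakeet = ["Polly", "cracker", "Polly", "cracker", "Polly", "cracker"]
print(shannon_entropy(parakeet))         # 1.0 bit

# A speaker drawing evenly from a 10,000-word vocabulary offers
# log2(10,000), roughly 13.3 bits of potential surprise per word.
rich_vocabulary = [f"word{i}" for i in range(10_000)]
print(shannon_entropy(rich_vocabulary))  # about 13.3 bits
```

The wider the sender's degrees of freedom, the higher the entropy and the harder the source is to compress.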
The world is based on expectations. We feel deep emotion when we are pleasantly surprised – or our hopes are dashed. Financial markets are governed by “expected returns” and “expected volatility.” Neuroscience increasingly shows our brains work by detecting deviations from expected patterns. The known pattern is information. The potential for deviation is entropy.
George Gilder, the technology author and venture capitalist, offered a summary insight into information theory: “A high-entropy message requires a low-entropy carrier.” A voice conversation is best heard over a channel with little static. Writing appears most clearly on a blank white page.
The entire digital economy is based on this principle. On microchips, electrons move at gigahertz frequencies. We can only read and manipulate them, however, because of the supreme regularity of the silicon substrate. Likewise, the fiber optic glass that carries data across the globe is so pure, you could see through a window three miles thick. The lasers traveling through the glass, moreover, are themselves the purest form of light imaginable. This profound regularity is essential to transmit highly irregular, high-entropy messages – say, YouTube videos.
It’s a long way from Adam Smith to YouTube. What is the link?
Innovation and Entrepreneurs at the Center
For a couple of centuries, economic analysis focused on land, labor, and capital. Early last century, Frank Knight added a fourth key factor – scientific knowledge – and also put entrepreneurs at the center of this fast-moving, uncertain world. Joseph Schumpeter, likewise, emphasized entrepreneurship and technical change. Later theoretical and empirical work showed Knight and Schumpeter’s descriptive insights to be true. In the 1950s, for example, Robert Solow formalized the labor and capital model of economic growth. When the model was compared to the real world, however, some 85 percent of economic growth went unexplained. This “residual” was chalked up to a generic technology factor, a sort of steady, predictable march of general knowledge. In the 1980s, Paul Romer argued that this technology wasn’t an afterthought; it was the primary product, the purpose, the driver of the economy.
Most economists have now come to believe that economic growth is chiefly a function of better ideas, or innovation. How, then, do we still disagree about so much?
Our existing economy is based on stable, predictable foundations: roads, supply chains, efficiently run firms. Economic growth, however, is about what’s new and surprising. It is propelled by new technologies, new products, new firms, and creative ideas. The chief proprietors of “new” are entrepreneurs and their ventures. The entrepreneur generates new information through both the success and failure of experimentation. The more experiments attempted, the greater the probability of creating new knowledge. We learn what does and doesn’t work, and why.
The creative process is often messy and inefficient, chaotic and wasteful. It is often invisible or, when we can see it, difficult to describe. We cannot write computer programs to execute creativity. But the result of better ideas or new technologies is surprising new efficiency that anneals the platform into an ever-flatter, more solid, lower-entropy substrate. New knowledge reduces uncertainty and improves the efficiency of the larger economy. A more predictable world allows entrepreneurs to take more far-flung risks, explore unknown arenas, and specialize more deeply.
Platforms and Progress
These innovations, often optimized by big firms, reduce the entropy of the platform – which could be an organization, a communications network, or an entire economy – and thus further enable the high-entropy messages of the future. The cascading process continues.
It is therefore not surprising that the world’s wealthiest companies – and countries – are essentially the best platforms for innovation. Apple created the iPhone ecosystem, which in just four years has spawned 750,000 new software apps, downloaded 40 billion times. Google built a “knowledge platform” – a vast Internet infrastructure of computers, storage, and bandwidth – whose main purpose is the efficient collection and diffusion of information. Apple didn’t have to think up all those new apps itself; Google didn’t create all that diverse content.
The United States is itself a knowledge platform – a nation built on the predictable rule of law, a large common market where people, goods, and information could flow freely, a place where relative political stability in the core unleashed unlimited diversity and experimentation among individuals and groups.
Knowledge platforms multiply the degrees of freedom enjoyed by their users. Gutenberg changed the world with a knowledge platform – movable type that spread books and learning to the masses. New knowledge of mechanical power then freed the masses from agriculture and allowed them to pursue millions of new niches, each one a journey into the unknown. Quantum knowledge built the microchip and then the Internet, so far the ultimate platforms for creativity and discovery.
The price system is a knowledge platform that efficiently integrates and transmits information. It is dependent on a predictable unit of account, which is why stable currencies promote enterprise and chaotic currencies promote chaos. When a monetary authority ensures money’s value, businesses and consumers can make decisions based on the intrinsic value of goods, services, inputs, and outputs, and they can do so over reasonable time horizons. They can focus on innovation. Feedback – sales, profits, losses – imparts meaningful information. On the other hand, when a monetary authority is itself a primary source of entropy – when the value of money is uncertain – energy and capital flow disproportionately toward financial hedging activities. These activities yield profits for some but don’t create much new knowledge for society.
The Panic and Great Recession is a good example of information breakdown. The monetary authority often alters the value of money with the goal, like that of other central planners, of reducing risk for companies and consumers. The perverse outcome, however, is often increased systemic risk. The notions of a Greenspan Put or Helicopter Ben created not a neutral platform for enterprise but a massive one-way bet on the value of money itself. The result was less enterprise and more “finance,” which before the Panic accounted for a historic 35 percent of the economy’s profits.
The housing bubble was not about “bankers taking too much risk.” It was about one-way bets, the antithesis of risk. Financial players had few degrees of freedom. “As long as the music is playing,” Citigroup’s Charles Prince said, “you’ve got to get up and dance.” Profiting while the music played, and bailed out when it stopped, bankers faced only one real risk: not “dancing.”
The Fed-Treasury weak-dollar policy cheapened credit and inflated hard assets (homes, oil). Basel banking guidelines prescribed a gluttonous diet of “triple A” mortgage securities. Washington subsidized Fannie and Freddie, who subsidized home-mortgage debt. Mark-to-market accounting drove lending on the way up and decimated lending on the way down. It prevented investors from judging the value of individual asset pools and financial firms. Interacting with capital reserve requirements, it ignited a fire sale. Bankers had been turned into predictable parakeets. In effect, there was more complexity in the top-down rules than in the bottom-up markets. Exactly the opposite of what we want: simple rules for a complex world.
The world is inherently risky and uncertain. Bad things happen. We don’t know if investments or start-ups will succeed. When risk and uncertainty are decentralized, however, we get lots of experimentation and lots of small failures. We learn and move on, better prepared for the next try. The centralization of power, information, and money – and thus the centralization of risk – is more likely to yield widespread, systemic failure.
With more neutral monetary, housing, and accounting policies, investors would have had to weigh more factors. More investors would have been on either side of more diverse “bets,” leading to more, smaller, and thus non-systemic successes and failures. More importantly, non-financial activity – industry, technology, services – would have been far more robust.
Complexity: An Essential Good
Many of Nassim Nicholas Taleb’s insights from The Black Swan and Antifragile are components of this larger paradigm. So is Brink Lindsey’s new ebook Human Capitalism. They acknowledge the fundamental, building, and nested complexity of our world. Complexity, moreover, is not something to be rectified – it is an essential good. It is both the source of real growth and a governor on unsustainable froth. Complexity is an error-correcting code. Antifragile’s subtitle – Things That Gain From Disorder – highlights the importance of bottom-up development and the resilience that accompanies decentralization and competition. Taleb, however, is so consumed with the downsides of a complex world, and advises strategies so “hyperconservative,” that he often ends up preaching futility and paralysis. The Santa Fe Institute, likewise, has for years looked to complexity as the next economic paradigm but has mostly emphasized the chaotic nature of the economy and has thus run into explanatory dead ends.
That’s because intelligence, effort, and entrepreneurship are real. Friedrich Hayek’s “spontaneous order,” Santa Fe’s “chaos,” and Taleb’s “black swans” are alluring descriptions – but only when viewing earth from outer space. Perspective matters. “Emergent phenomenon” is another phrase for something we don’t understand. Some interpretations of “spontaneous order,” “chaos,” and “black swans” can deny the purpose and intelligence of individual actors. Yes, the world appears random from on high. But the farmer, architect, chef, and software developer don’t produce goods without thought or effort. They make choices. They have plans. They organize resources. They are creative. Their work produces new knowledge and new value.
Many economists, even libertarians, fetishize markets but downplay the importance of entrepreneurs. Markets are indeed crucial. But too often advocacy of markets is merely an excuse to impose regimes of “perfect competition” – which by definition deny differentiation, uncertainty, and thus the possibility for growth. Leonard Read’s famous “I, Pencil” is a brilliant fable of our complex economy, of “spontaneous order.” But without an emphasis on the individuals and companies that design and produce the components and the end product – the pencil – a reader might be led astray. The hyper-financialization of our economy these last few years is an example of elevating abstract markets over actual enterprise.
Too often the fruits of capitalism are taken for granted, technology and wealth seen as exogenous givens. They are assumed inevitabilities of our “spontaneous” world, thus easily rearranged and redistributed from on high. This view denies the entropy of enterprise.
Glen Weyl, the brilliant young economist at the University of Chicago, surmises that our heroic new data-gathering and analytical tools wipe away Hayek’s “knowledge problem.” Hayek argued that society is too complex for a central authority to understand, let alone successfully manipulate, its billions of psychologies, decisions, prices, and ideas. But “information technology,” Weyl counters, “fundamentally challenges the standard foundations of the market economy.” Governments “will harness the power of the data and computational power… to provide increasingly precise and accurate prescriptions for economic planning.”
Good luck. Millions of agents are also collecting, creating, analyzing, and acting on information, including incorporating the planners’ policies into their own plans. Understanding what happened is useful. Yet with all the economic and financial data at our disposal today, our brightest minds do not agree on the past. Figuring out what should happen so bureaucrats can plan the future is an even more difficult and unlikely task.
A truly entrepreneurial venture, or an invention, is not compressible – it has never been done before. A computer program or string of bits shorter than the actual bits needed to describe the venture could not be written before the venture or invention existed; to write such a string would be to make the invention itself. If a single new idea is not compressible, surely the world is not compressible. The very technology that empowers the bureaucrat further complicates the world. Big Data is not the savior of Big Government.
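The point can be checked with any off-the-shelf compressor; this quick sketch (ours, for illustration) echoes the earlier 10101010 example:

```python
import os
import zlib

pattern = b"10" * 5000       # a perfectly regular 10,000-byte string
noise = os.urandom(10_000)   # 10,000 random bytes, a stand-in for genuine novelty

print(len(zlib.compress(pattern)))  # a few dozen bytes: "print 10 and repeat" suffices
print(len(zlib.compress(noise)))    # roughly 10,000 bytes: no shorter description exists
```

What is already patterned compresses to almost nothing; what is genuinely new cannot be written down in advance in any shorter form.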
The complexity of the economy will always outpace our ability to analyze it. When we can fully grasp it, we’ll know the end is near. Planning the future necessarily limits the degrees of freedom and thus possible outcomes. Individuals, companies, and, yes, nations do plan – thus limiting degrees of freedom below them in the hierarchy. The higher up the stack the planning takes place, however, the fewer degrees of freedom for the whole system, the less overall uncertainty, and the less chance for learning, surprise, and growth.
Some hierarchical rules and standards actually do expand the degrees of freedom of agents below. Common languages – English or TCP/IP, for example – impose certain constraints but unleash creative communication. Standard weights and measures restrict our ability to define our own gram or meter but encourage cooperation. Good rules yield far more degrees of freedom than they restrict. Political theorists might see an analogy to ordered liberty.
Embracing Ignorance and Free Enterprise
Why then do corporate and government bureaucracies so often go beyond basic rules and standards? Why do we pretend we can see the future – and plan it from on high? Why is it so difficult for economists and policymakers to resist this fatal conceit?
Because of this simple but counterintuitive idea: ignorance is the key to the knowledge economy. And ignorance is uncomfortable. What if the future doesn’t go as planned? What will the future do without my guidance? These are understandable impulses. But as the physicist and philosopher of science David Deutsch writes, “The unpredictability of the content of future knowledge is a necessary condition for the unlimited growth of that knowledge.”
Will our economy resume growth in the short term? Is rapid growth still possible over the long term? Is there a fundamental limit? The New Normal says U.S. growth will fall to around half our historical per-capita trend – to 1 percent from 2 percent. Economist Robert Gordon, in his new paper “Is U.S. Growth Over?” thinks the next several decades will be even worse – maybe just 0.7 percent.
Our information approach says the growth rate is dependent on the entropy of the economy – or its potential to create new ideas. That potential, in turn, depends on the number of actors and the degrees of freedom they enjoy, the number of links among them, the “bandwidth” of those links, and the noise that interferes with efficient information transfer. Developing economies can grow by copying. They learn by looking around. Economies operating at the technological frontier, however, must innovate.
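The vocabulary here (links, bandwidth, noise) deliberately echoes Shannon's channel-capacity formula; mapping it onto an economy is our analogy, not a literal law:

```latex
% Shannon-Hartley channel capacity: the maximum rate C of reliable information
% transfer over a channel of bandwidth B with signal-to-noise ratio S/N.
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```

Read loosely, more actors with wider, cleaner channels among them raise the rate at which new knowledge can be created and diffused; noise, whether static on a wire or policy uncertainty, lowers it.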
A thermodynamic view of the world sees only potential for arbitrage. An intelligent, knowledge-centric economy, on the other hand, is creative. If we lived in a purely thermodynamic world, the United States would be in trouble. Developing nations would copy our technology and produce goods with lower cost labor. This is, of course, part of the story – but not the whole story. Fortunately, an economy of mind, such as ours, is a quantum system, not merely a thermodynamic or mechanical one. As MIT quantum information physicist Seth Lloyd notes, “quantum mechanics, unlike classical mechanics, can create information out of nothing.”
The New Normal (or Worse) scenarios are pessimistic. Yet they may accurately reflect a diminution of economic entropy – our capacity to generate new ideas. For example, is American education producing enough knowledgeable citizens, who know how to learn? Is centralization of power and information pushing actors toward unproductive activities, instead of new discoveries? That General Electric reportedly has 950 employees in its tax division alone suggests one answer. Are we adding high-entropy actors to our economy via smart immigration policy? Are we steering would-be entrepreneurs down dead ends with subsidies for particular technologies or business models (think green energy)? Will rising tax rates reduce the information processing rate of the economy?
Claude Shannon warned against applying information theory to fields beyond communication. Yet he himself used it to analyze the stock market, and today it is used in neuroscience, molecular biology, and across the universe of statistics and Bayesian analysis. Stanford’s Thomas Cover, who died last year, used information theory to create universal financial portfolios. And the information theorist E.T. Jaynes, a Shannon disciple, believed it applied to the broad economy:
Economic stagnation … can have two quite different causes: loss of entropy gradient, and loss of dither [or animal spirits]. Without an entropy gradient, the sense of direction is lost and the system drifts aimlessly. Usually one calls this motion ‘random’ but what we really mean by that is ‘determined not by any macroeconomic variables but by unrecorded details of microeconomic variables.’ When the government changes policies, it is changing the entropy function.
After decades of work in the field, Robert E. Lucas Jr., the world’s dean of growth theory, is sharpening his focus on knowledge and information. In a 2009 paper called “Ideas and Growth,” Lucas offered a model of growth based on three basic parameters: the speed at which agents process ideas, or a “learning rate”; the average quality of each agent’s environment; and the diversity of that environment. One could be forgiven for seeing a communication function: an information rate in the presence of noise and other connected, intelligent nodes. In a newer paper, with Benjamin Moll of Princeton, Lucas gets at the virtuous cycle of efficiency and creativity outlined above. Now his agents allocate time to either efficient production or new-idea generation.
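For a rough feel for the mechanism, here is a toy sketch of idea diffusion (ours, not Lucas's actual model): agents meet at random and copy better ideas at some learning rate.

```python
import random

def simulate(n_agents=1000, periods=100, learning_rate=0.3, diversity=1.0, seed=0):
    """Toy idea-diffusion sketch: each period every agent samples a random peer
    and, with probability learning_rate, adopts the peer's idea if it is better."""
    rng = random.Random(seed)
    # Initial productivities: 'diversity' controls how dispersed the starting ideas are.
    z = [rng.lognormvariate(0.0, diversity) for _ in range(n_agents)]
    path = []
    for _ in range(periods):
        for i in range(n_agents):
            peer = rng.randrange(n_agents)
            if rng.random() < learning_rate and z[peer] > z[i]:
                z[i] = z[peer]
        path.append(sum(z) / n_agents)
    return path

path = simulate()
print(path[0], path[-1])  # average productivity rises as good ideas spread
# Note: with a fixed pool of ideas, growth stalls once the best idea has diffused;
# sustained growth requires agents who also generate genuinely new ideas.
```

The toy's limitation is itself the essay's point: copying alone exhausts the existing stock of ideas, which is why the frontier economy must keep generating new ones.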
We don’t know with precision what the potential of the economy is. The worst thing we could do, however, is hunker down in preparation for the New Normal (or Worse), insisting on security and certainty. If our entropy analogy is useful, and if the Lucas knowledge models are correct in their essentials, then unleashing entrepreneurs is more crucial than ever.
“The ability to create and use explanatory knowledge,” concludes Deutsch, “gives people a power to transform nature which is ultimately not limited by parochial factors, as all other adaptations are, but only by universal laws.”
This information business is not nearly exhausted, because ours has always been a knowledge economy. And it always will be. The question is whether we are intrepid in our ignorance.
Bret Swanson is president of the technology research firm Entropy Economics LLC.