This article was first published in GK Source in 2005.
The strength of the American economy over the next 20 years depends largely on our ability to keep our productivity growing. And productivity grows when a large set of novel technologies changes business practices and creates new industries. Interest rates, deficits, Federal Reserve policies—those will all make a difference, of course, but they won’t be the driving force. That force will be technology and its ability to transform the economy. It’s too early for biotech and nanotech to transform anything—their time has not yet arrived. So the main hope for future economic golden eras remains that tarnished cluster of technologies we call information technology.
Does information technology still have the oomph to power the economy? Its critics tell us no—that the computer’s days of greatness are over. “The IT buildout is much closer to its end than its beginning,” says Nicholas G. Carr in a recent article in Harvard Business Review. “The opportunities for gaining IT-based advantage are already dwindling.” And Oracle’s Larry Ellison declares that the IT industry “is as large as it’s going to be.”
If you see the digital revolution as the adoption of digital devices or the buildout of its Internet infrastructure, you would have to agree. You’d have to ask just how many more teraflops and servers and fiber-optic cables business really needs. But that is a limited way to look at information technology. It is better to see the digital revolution as merely the latest in a series of historical technology revolutions. In the past, each of them has gone through a predictable sequence of initial technological turbulence, media glamor, and investment boom. Then a crash. After the crash, the revolution seems to be finished, but in each case, there follows a massive buildout in which the new technology deeply transforms the economy and produces decades of prosperity. Britain’s railway revolution underwent its investment frenzy in the mid-1840s, followed by a crash in 1847. There followed a pattern we would recognize today: disgraced executives hauled before government inquiry boards, stocks worth 15% of their peak value, a collapse in investment. Yet in the decades after the crash, track miles in Britain increased by a factor of ten. In 1859 the U.S. went through an echo of Britain’s experience. “Our railroad system has cost more than $1 billion and has brought ruin upon nearly everyone connected with it, the nation included,” Henry Carey Baird, an economic commentator, said at the time. But railroads in the U.S. in the next four decades increased their track mileage by a factor of eight.
More than that, railroads became an engine of growth: They brought a whole new era of economic expansion. They opened the Western states to commerce with the rest of America, they helped bring a massive steel industry into being, and they cheapened the economy’s inputs—the iron ore, timber, leather, and other raw materials that industry needed. Such structural changes happened gradually, and at the time were almost invisible. But they were powerful. In 1860, before the railroad buildout, the United States was by international standards a backwater economy. Some 40 years later it was the largest economy in the world.
The information revolution, like the railroad revolution before it, is causing deep structural transformations in the economy—three transformations in particular.
First, with the arrival of networks of all kinds—wired and wireless—everything is getting connected: devices, systems, machines, business processes, even networks themselves. Information technology’s task these days is to get these items to “converse” seamlessly and remotely with one another. As a result, processes that were carried out by humans 15 years ago now become interactions among devices that talk with one another, determine the appropriate action, and execute it.
Aircraft navigation used to consist of a cockpit crew member doing hand calculations and passing them to the pilot. Now it is a conversation among an onboard GPS system, a constellation of orbiting satellites, ground stations, and an aircraft control system. Architectural design was formerly an interaction between a human and a physically drawn design. Now much of it is a conversation among a CAD program, a digital rendering of the building, a database that tracks the design’s materials and costs, and further databases of building regulations. “In the 1980s,” says Paul Saffo, director of the Institute for the Future in Menlo Park, Calif., “business was about people talking to other people. Now it’s about machines talking to machines.”
Second, digital technology is making business smarter. Cheap sensors—tiny processors that can see, listen, and pass messages wirelessly to one another in sensor networks—are becoming available. When attached to grocery items, trucks, or machines in factories, they too can converse—and collectively provide intelligent action. Savi Technology of Sunnyvale, Calif., has added sensors to the containers the Defense Department ships to its troops worldwide, including the Middle East. Each sensor tag stores a list of its container’s contents, its current location, and its planned routing. When a container passes through a port—Rotterdam, say—a reader wirelessly senses its tag; the tag updates its recorded whereabouts and alerts the next nodes in the Defense Department’s supply chain, which in turn notify further control points to expect its arrival. “No part of the system is necessarily that smart,” says Lance Trebesch, Savi’s vice president of security. “But the overall system is very, very smart.” The U.S. General Accounting Office estimates that if the Defense Department had had such a supply chain in effect during the Gulf War, it would have saved $2 billion by keeping better track of shipments and inventory.
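The tag-and-checkpoint flow described above can be sketched in a few lines of code. This is a simplified illustration only; the class, field, and place names are hypothetical and do not reflect Savi’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class ContainerTag:
    """Hypothetical model of a shipping-container sensor tag: it travels
    with the container, carrying its manifest, location, and routing."""
    contents: list
    route: list                      # ordered checkpoints on the planned routing
    location: str = "origin"
    log: list = field(default_factory=list)

    def checkpoint(self, port: str) -> list:
        # A reader at the port senses the tag; the tag records its new whereabouts...
        self.location = port
        self.log.append(port)
        # ...and returns the downstream nodes that should expect the container next.
        i = self.route.index(port)
        return self.route[i + 1:]

tag = ContainerTag(contents=["rations", "spare parts"],
                   route=["Norfolk", "Rotterdam", "Kuwait"])
alerts = tag.checkpoint("Rotterdam")   # container passes through Rotterdam
print(tag.location, alerts)            # Rotterdam ['Kuwait']
```

No single piece of this is smart, which is the point of the quote above: the intelligence lives in the messaging among nodes, not in any one of them.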
We are so used to such advances that we scarcely notice them. But consider their consequences. Before the digital revolution, businesses took in raw materials, processed them, and sold their output to other businesses. Each was pretty much a stand-alone unit in an input-output economy. Today, with suppliers, production processes, customers, and shipments all digitally connected, the economy is becoming a vast collection of automatic conversations, with myriad devices sensing, querying, and messaging one another—and taking action.
Which brings us to our third transformation: Digital technology is helping birth completely new subindustries. That would not happen if traditional industries merely adopted computation to enhance their existing tasks. But industries don’t adopt information technology; they encounter it. That is, they come up against information technology in a way that changes them. And in doing so, they combine some of their operations with some of digitization’s to create novel activities—and new industries. Thus, the movie business encounters digitization and uses it to endlessly morph images, warp them, composite them, color-saturate them. The result is Shrek, the Matrix movies, and an entirely new digital special-effects subindustry. The banking industry encounters digitization and uses it to figure smart combinations of options, futures, swaps, and other derivatives. That creates modern financial risk management and much of the huge financial derivatives industry. The pharmaceutical industry uses digital operations to explore molecular combinations for the design of new drugs. Traditional fields in science create new subfields that way too. Genetics combines molecular biology with digital manipulation to create genomics—and in turn, a potentially large future industry built upon gene diagnostics and gene therapies. What digitization provides, par excellence, is functionalities—operations carried out on things. They are not merely traditional operations speeded up by computers. Many—GPS navigation, for instance—are novel operations created by digitization. And impossible without it.
At its heart, the information revolution is about transformation—transformation of the very structures and processes by which the economy works. The productivity statistics corroborate that. Since 1995, output per hour has grown 2.5% annually, with the most computer-intensive industries posting the greatest gains. Even in the gloom of last year, productivity—think of it as an index of change in business practices—rose 2.8%.
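As a back-of-the-envelope check on what that rate implies (my arithmetic, not a figure from the article), 2.5% annual productivity growth compounds to a roughly 64% rise in output per hour over two decades:

```python
# Compound a 2.5% annual productivity growth rate over 20 years.
rate = 0.025
years = 20
growth_factor = (1 + rate) ** years
print(f"Output per hour after {years} years: {growth_factor:.2f}x")  # ~1.64x
```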
We can expect such growth to continue because it will take decades for the digital transformation to work itself through the economy.
One thing I’ve learned from looking at past technology revolutions is that the deeper the transformation, the more slowly it takes place. Electric motors became available around 1880. At that time all machines in a factory were powered by a single, lumbering steam engine. The new electric motors could each power a single machine—a weaving loom, for example—allowing flexibility and potential cost savings. But their proper use required redesigning factories and production processes, and that took more than 40 years. Economic transformation is slow not because it requires new equipment but because it requires new—and often not obvious—ways to organize business.
Who will benefit from the digital transformation? Technology companies will, of course, but only to some degree. They are mainly the suppliers of tools. The main beneficiaries will be the rest of the economy—the Wal-Marts, Fords, and FedExes of business that use the new technology most effectively and intelligently.
There’s one dark spot—at least for the U.S.—in this positive picture. Technology itself is steadily migrating offshore. Countries that once seemed unlikely producers of high-tech—India, China, Finland, Ireland—are gaining market share in everything from digital processors to cell phones to software. That seems to endanger America’s international competitive position, but I don’t think the U.S. should try to prevent it. For one thing, the U.S. needs good trading partners, and prosperous neighbors make politically stable neighbors.
But more than that, as the U.S. endures a never-ending squeeze at the lower end of advanced technology, I believe it will retain its position at the top. One reason is that advances in technology proceed from advances in science, and the U.S. has sufficient leadership there to carry it for several decades more.
Another is that the creation of advanced technology—and I mean the bringing into being of truly high-level, on-the-edge, novel technology, and not just the manufacture of it—cannot simply be put in place at any time by any country. The ability to generate that kind of technology builds over decades. It’s a tradition that achieves a momentum and stays rooted in particular places. Real high tech is not mere knowledge lifted from technical journals and applied to some purpose; it is craft—deep craft—rather like the practice of cooking at the Cordon Bleu school in Paris or of making violins in the 1700s in Cremona, Italy. What counts is knowing what precise methods work, why they work, what parameter settings to use, what new principles look promising to cut through the inevitable obstacles, what paths to pursue next. “What’s also really useful is knowing what doesn’t work—and that’s never published,” says Mark Yim, who leads an advanced robotics group at PARC (formerly Xerox PARC), the legendary technology lab in Palo Alto.
That sort of expertise is not just raw knowledge, publicly known and easily transferable. It is a collection of knowings—of 1,001 particular details and methods—that resides implicitly in people’s minds and is closely guarded. It builds upon itself within small groups in particular high-tech labs and in particular localities, so that once a region—or a country for that matter—gets ahead in a set of specific advanced technologies it becomes difficult to challenge. The detailed expertise needed to push the edge is simply not available outside. “People have tried to replicate PARC and its culture in the past,” says Johan de Kleer, manager of PARC’s systems and practices lab, “but nobody has succeeded.” Once technologies become more routine, of course, they are easy to reconstruct, and other places can develop them.
That tells me that the U.S. can retain its leadership in the new important technologies, and therefore its international competitive position, provided that it pays close attention to education and advanced science, and to its small but potent high-tech cultures. Of course, there will still be competition. Other countries do have first-rate science bases and will certainly challenge the U.S. in particular technology markets. But when it comes to the deep craft of technology, it’s America’s game to lose.
When technology stocks crashed a few years ago, the glamor of digital technologies wore off, and it will never come back. The new period will be one of hard work. But beneath the surface storms and uncertainties, the digitization of business is steadily transforming the economy, and the next two decades will see continually rising productivity, increasing growth, and the inception of new industries. That warrants a return of quiet, well-grounded confidence. But not complacency: The U.S. can always flub its advantage through careless government policies and inadequate attention.
What is going on, like all deep change, is slow, almost unnoticeable. But it is not the mere adoption of computing, nor the building of an information infrastructure. It is something more profound—the building of a neural system for the economy. It parallels the Industrial Revolution, whose machines provided energy sources—a muscular system, if you like—for the economy. Of the two revolutions, the digital one will turn out to be deeper. And it is still only beginning.
W. BRIAN ARTHUR is Citibank Professor at the Santa Fe Institute.
His website is www.santafe.edu/arthur.