
Wretched Excess

Why this recession will be worse.

After one of the best ten-year runs in economic history, the torrent of bad news flooding Alan Greenspan's office this January had to be jarring. The country had just seen its worst quarter of economic growth since 1995, and manufacturing activity had fallen to its lowest level since 1991. Spending by businesses on new plants and equipment had dropped for the first time in a decade. And the stock markets' decline, already nine months old, showed no signs of abating. So Greenspan lowered interest rates, over and over again. By mid-August, the venerable Fed chairman had cut the target federal funds rate, the key short-term interest rate, by an astonishing three points--the fastest rate reduction in almost 20 years.

Greenspan's actions weren't controversial. Cutting interest rates--that is, using "monetary policy"--is the anti-recession weapon of choice across the economics profession. Lower interest rates make it cheaper to borrow money, which encourages consumers to buy big-ticket items--like appliances, cars, and homes--and companies to invest more heavily in new equipment. Since recessions usually occur because consumers have stopped spending, leaving businesses with excess inventory and forcing them to scale back production, cutting interest rates is often the perfect antidote.

Indeed, with a little help from fiscal policy--that is, using tax cuts or government spending to put extra money in the hands of consumers and businesses--the Federal Reserve has gotten pretty good at managing recessions this way. As Northwestern University economist Robert Gordon has pointed out, recessions between 1846 and 1945 were two-thirds as long as their corresponding expansions. Between 1945 and 1982, recessions were only one-fourth as long. (America's most recent recession, in 1991, fits this pattern as well.) So despite the stream of bad economic news, economists were confident that Greenspan's rate cuts would work their magic soon enough. "We're in that awkward period where ... people are losing faith," Mickey Levy, the chief economist at Bank of America, told the Los Angeles Times in August. "They say, 'Monetary policy doesn't work.' But guess what? It always does."

Even the near economic shutdown that followed September 11 hasn't shaken this conviction. After two additional half-point rate cuts, Glenn Hubbard, chairman of the White House Council of Economic Advisers, insisted that "[t]he dynamics of the recovery are likely to be fairly similar to what we've seen before. I have no reason to believe [monetary] policy won't be very effective." According to a recent survey by Blue Chip Economic Indicators, most economists agree. The majority of those surveyed predicted a recovery sometime early next year, with GDP growing at a 1.4 percent annual rate in the first quarter and a 2.9 percent rate in the second. By mid-summer the economy should be back in overdrive.

If this recession is like every other American recession since World War II, that optimism is fully merited. The problem is that there's mounting evidence it's not. This time around, it wasn't a change in consumer spending that brought the economy to a standstill; it was largely a change in business spending. As anyone who has picked up a financial page in the last year knows, the story of the 1990s is that company after company bet heavily that consumers would purchase huge amounts of goods and services in the future, particularly in areas related to information technology (IT). Businesses raised vast amounts of capital, bought and sold equipment at a frenzied pace, and dramatically increased their productive capacity--only to discover that every other business was doing the same, thus increasing the nation's overall productive capacity far beyond what consumers could realistically support. When that became apparent, businesses all but stopped investing in new equipment, and the economy--whose growth in the late '90s had been driven in part by a surge in investment--suddenly and dramatically faltered.

Cutting interest rates in the middle of this type of economic slowdown, as Greenspan did this year, won't help much. As long as companies feel they already have too much equipment, no amount of interest rate cutting will induce them to buy more. And cutting rates before such a slowdown begins in hopes of staving it off--as Greenspan did in 1998--may even be counterproductive. Low interest rates at the tail end of a long boom can encourage businesses to binge on new equipment when they already have too much. Then, when the optimism finally fades, companies must shed even more productive capacity. Some companies retire equipment; others merge or go out of business. Either way, the adjustment is more painful than it would otherwise have been.

In other words, economists' current faith in monetary policy has a lot to do with the kinds of recessions we've seen in the past, and little to do with the recession we're in now. By ignoring the difference, they're not only failing to solve the problem, they may actually have made it worse.

As economists describe it, the standard, postwar business cycle begins with an expansion, when demand picks up after the previous recession. Before long, this modest growth evolves into an outright boom, in which the economy begins expanding faster than its long-term growth rate and very near its overall capacity to produce goods and services. Next is what economists refer to as the "crunch" period, during which time the onset of inflation--as consumers demand more goods and services than can be produced--provokes the Fed to raise interest rates. These higher interest rates cause consumers to scale back big-ticket purchases and, usually not long after, businesses cut back production to compensate. This stamps out inflation, but triggers a recession, as layoffs and bankruptcies ensue. The final phase, "reliquefication," describes the financial restructuring that occurs when businesses and consumers pay off the debts they accumulated during the boom period, and businesses liquidate the inventories that had piled up along the way. Some time after that happens, the assumption goes, consumers (and then businesses) start spending, and the cycle begins all over again.

Given that framework, it's easy to see how public policy can speed the recovery along: When demand slows down, you simply cut interest rates to get consumers spending again. But suppose you faced a different sort of recession. Suppose that, instead of a credit crunch, in which rising interest rates caused consumers to cut back, businesses were the ones that stopped spending first, because they realized all the demand they had been anticipating wasn't going to materialize for years (if ever), even though they already had the capacity in place--both in terms of equipment and employees--to accommodate it today. At that point, no amount of interest rate reductions would prompt more spending. You'd just have to ride out the storm.

Unfortunately, that's the situation we seem to be in. Between October 2000 and March 2001, corporate profits shrank at an annual rate of 20 percent, as the demand for goods and services fell vastly short of expectations. Facing declining profits, businesses simply stopped spending: Investment in capital equipment froze in the first quarter of 2001, having increased by over 21 percent in the same quarter the year before. Much of the drop-off was driven by an astounding 12.8 percent decline in spending on information technology, which at the time accounted for about half of all capital investment. The impact on an economy propelled by investment in information technology was not hard to discern. From July 1999 to June 2000, IT spending had increased 16.6 percent, and the economy as a whole had grown at a rate of 5.1 percent. Over the following year, IT spending fell by 6.5 percent and GDP growth weighed in at a mere 1.3 percent.

And capital spending still hasn't hit bottom. Until now, software had been the lone safe haven amid the information technology collapse. While spending on hardware dropped at an annual rate of 31 percent in the first half of this year, software fell just 5 percent. Indeed, software remains one of the few sectors that venture capitalists are still funding. But there's reason to suspect we're on the verge of a software collapse as well. As Stephen Roach, chief economist for Morgan Stanley Dean Witter, points out, companies spent $1.40 on software for every dollar they spent on hardware between 1990 and 1997. That figure rose to $1.70 between 1998 and 2000, largely due to the rise of Internet-related business. But because hardware spending was falling at a much faster rate than software spending, the ratio had risen to $2.09 by the middle of this year. There's no way that kind of ratio can be sustained: Businesses only need so much software to run a fixed amount of hardware.
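The mechanics are easy to check with a back-of-the-envelope calculation. The sketch below simply applies the two decline rates cited above to the 1998-2000 ratio for half a year; this is our simplification, since the true ratio depends on actual spending levels, but it lands close to the figure Roach reports:

```python
# Back-of-the-envelope check of the software-to-hardware spending ratio.
# The starting ratio and decline rates are the figures cited in the text;
# applying annualized rates over half a year is our own assumption.
baseline_ratio = 1.70      # dollars of software per hardware dollar, 1998-2000
hardware_decline = 0.31    # annualized drop in hardware spending, first half of 2001
software_decline = 0.05    # annualized drop in software spending, same period

half_year = 0.5
hardware_factor = (1 - hardware_decline) ** half_year   # ~0.83
software_factor = (1 - software_decline) ** half_year   # ~0.97

new_ratio = baseline_ratio * software_factor / hardware_factor
print(f"implied mid-2001 ratio: ${new_ratio:.2f}")      # ~$1.99, near the $2.09 cited
```

The point of the arithmetic is simply that when the denominator (hardware) shrinks faster than the numerator (software), the ratio climbs on its own--no new software enthusiasm required.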

It gets worse. At its peak in 2000, capital spending accounted for only 14 percent of corporate costs, meaning there was only so much cost-saving that restraining investment could yield. Much more daunting was the 70 percent of corporate costs accounted for by labor. In fact, as Roach has noted, the glut of capital equipment that firms accumulated in the late '90s was surpassed only by the glut of personnel they hired, again in anticipation of demand that was still years away. Much of that personnel glut came in the form of managers. By 2000 the size of the nation's managerial class had risen to nearly 20 million, up more than 20 percent since 1994. Which meant that, to really cut costs, companies would have to cut high-paying jobs. And, over the last few months, they've done exactly that. From October 2000 to February 2001, managers accounted for less than 2 percent of the 400,000 Americans who lost their jobs. Between March and June, they accounted for a whopping 64 percent of the 486,000 who did. And though the subsequent spike in layoffs is typically blamed on the terrorist attacks, its impetus clearly predates September 11. In fact, according to The New York Times, many firms actually delayed and even moderated their layoffs in the weeks following the attacks for fear of appearing callous.

But just because this recession is different from the ones we've seen in the recent past doesn't mean it's unprecedented. There is a model to explain it, and it comes from the early-twentieth-century economist Joseph Schumpeter. In his 1939 classic, Business Cycles, Schumpeter argued that economic cycles are essentially driven by the economy's response to innovation. His idea goes something like this: When some lucky entrepreneur invents a new technology, he borrows money, starts a firm, and begins to rack up huge profits. Soon, other entrepreneurs adopt more or less the same technology and begin making huge profits of their own.

The problem is that great innovations tend to create a surge of optimism, causing what Schumpeter referred to as a "secondary wave." That is, not only do firms in the innovating industry invest great sums, but other firms begin to speculate, figuring they, too, can cash in on the new technology. To borrow Schumpeter's example, a new factory means more business for the local grocer, which means more business for the wholesaler, which means more business for the manufacturer, who, in turn, ramps up production. Unfortunately, people get carried away when making these adjustments, as do the interests that finance them. As Schumpeter put it, "many people will act on the assumption that the rates of change they observe will continue indefinitely, and enter into transactions which will result in losses as soon as facts fail to verify that assumption."

Because Schumpeter understood growth as a function of innovation, he understood recessions as inevitable periods of adaptation: An economy assimilating a new innovation accumulates deadweight that must be purged--firms made obsolete, firms that adapt slowly or poorly, and firms dependent on firms in either of these categories. The speculative phase Schumpeter identified exacerbates the problem by creating more deadweight. As a result, more people have to bear more pain--in the form of bankruptcy and job losses--for a longer period of time.

Schumpeter made technological innovation the central feature of his model because, at the time he was writing, it was the most important force propelling the economy. Beginning in the mid-nineteenth century, the growth of railroads drove the nation's westward expansion. Around the turn of the century, electricity and automobiles were revolutionizing the industrial world.

Moreover, each boom period seemed to be followed by an equally violent bust. Take the post-Civil War railroad boom. As optimism surrounding the railroad expansion grew, banks lent to companies against bonds that were later sold to the public, and speculation in railroad stocks increased dramatically. That optimism reached a fever pitch in 1869, with the completion of a transcontinental route at Promontory Summit, Utah. As one contemporary observer put it, "They built [rail]roads everywhere, apparently in perfect confidence that the country would so develop as to support all the [rail]roads that could be built."

And then, in Schumpeter's words, "railroad construction had temporarily exhausted possibilities." The more than 50,000 miles of track laid in the United States roughly equaled the amount that existed in all of Western Europe, whose population was more than three times America's. Which makes it hardly surprising that the 1873 failure of Jay Cooke & Company--the banking house that financed the once-powerful Northern Pacific Railroad--could trigger a market crash. Stock prices and land values imploded (the latter having risen wildly in areas opened up by rail expansion), and 40 percent of railway bonds became worthless over the next few years. Annual construction of track slowed from 7,500 miles just before the crash to 2,500 miles afterward. And the ripple effects produced a depression whose magnitude would not be replicated until the 1930s. The reason? As Schumpeter put it, "[r]ailroad construction ... had so revolutionized the economic system that liquidation, absorption, adaptation ... was an unusually long and painful affair."

Fast-forward 125 years and the story is eerily similar. In the late 1990s fiber optic cable seemed to hold the same limitless possibilities that rails had in the 1860s. Amid a dramatic increase in Internet use, demand for "bandwidth" (i.e., capacity for handling communications traffic) skyrocketed. The main reason was that the vast majority of existing infrastructure had been made from copper wire, meaning that signals had to be transmitted electrically--not the most efficient process, as anyone who has ever tried to connect to the Internet using a regular phone line can tell you. The beauty of fiber optics, on the other hand, was that they transmitted signals as pulses of light. Light pulses lose far less strength along the way, can be switched on and off at vastly higher speeds, and--because a single strand can carry many wavelengths at once--allow each optic cable to haul millions of times more traffic than its copper counterpart. In light of that advance, one widely reported estimate predicted that telecommunications companies would spend more than $1 trillion upgrading their networks over the next 20 years.

Needless to say, the possibilities weren't lost on American businessmen. By June 1998 a fledgling company named Global Crossing had laid a single, 5,000-mile fiber optic cable. Two years later it had literally crossed the globe, the size of its network soaring to over 100,000 miles. During the same period, an army of "new economy" entrepreneurs was doing the same. At last count, a rival company called PSINet had assembled a million-mile network. Another rival, Qwest, had laid more than 190,000 miles of fiber. In all, more than 40 million miles of optical fiber were laid--enough to send an e-mail to Mars and still have cable left over. And, ironically, much of it was laid along existing rail lines.

Unfortunately, as with railroads in the 1860s, no one has figured out what to do with all that cable just yet. Even under the wildly optimistic, if popular, assumption that Internet traffic was doubling every few months, it would have taken years for demand to warrant the amount of cable actually installed. A report by Merrill Lynch this summer revealed that just 2.6 percent of all the fiber in the ground was being used. The vast majority of it isn't even operational (or "lit," as they say in the industry). As a result, wholesale bandwidth prices (i.e., the fees phone companies and Internet service providers pay to send information across optical networks) plummeted by around 60 percent last year. And with them, the prices of fiber optic companies' stocks. Today a share of Global Crossing, once valued at over $73, fetches a meager $1.58.

Given the parallels between today and the booms of the nineteenth century, why are economists so confident this recession will turn out like the relatively painless downturns of recent decades--recessions amenable to swift interest rate cuts by the Fed?

The answer has a lot to do with the evolution of the discipline over the last half-century. For all practical purposes, macroeconomics didn't exist as a distinct field prior to the Great Depression. The economy was seen as the sum of its individual markets, which meant no special tool was needed for analyzing it as a whole. Just as individual markets were self-correcting, the thinking went, so was the entire economy.

The Depression, of course, changed that view. A collapse in the nation's money supply triggered a severe deflation, meaning that each dollar was worth progressively more as time went by. (There is some debate as to what caused this collapse, with some economists citing a string of bank failures and others pointing to mistakes by the Fed.) With dollars worth more the longer people held them, there was no incentive to spend them sooner rather than later. Demand for goods and services plummeted, destroying millions of jobs in the process. Meanwhile, with the value of each dollar rising, it became progressively more expensive for employers to pay their employees the same number of dollars. Before long, the economy fell into a vicious cycle, as rising unemployment squeezed demand, and lower demand increased unemployment.
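The arithmetic of that squeeze is simple enough to sketch. The toy calculation below uses assumed numbers--a sticky $100 weekly paycheck and prices falling 10 percent a year, neither drawn from the period--to show how deflation steadily raises the real cost of the same nominal wage:

```python
# Toy illustration of the deflation spiral described above.
# Assumptions (ours, for illustration only): nominal wages are sticky at
# $100 per week, and the price level falls 10 percent each year.
nominal_wage = 100.0
price_level = 1.0

for year in range(4):
    real_wage_cost = nominal_wage / price_level
    print(f"year {year}: price level {price_level:.2f}, "
          f"real cost of the same $100 paycheck: {real_wage_cost:.1f}")
    price_level *= 0.90  # prices fall 10 percent

# Each dollar buys more the longer it's held, so hoarding beats spending --
# while the same nominal payroll grows ever more expensive in real terms.
```

After three years of such deflation, the unchanged paycheck costs the employer more than a third more in real terms, which is exactly the pressure that drove the layoffs of the early 1930s.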

It soon became apparent that the problem wasn't going to fix itself. Hence the need for a radical rethinking of the way we managed the economy. The Keynesian revolution, and the subsequent monetarist counterrevolution, accomplished just that--with Keynesians (notably Keynes himself) arguing that government must spend money to prop up demand in a recession, and a few decades later, monetarists arguing that the government must ensure a stable money supply. As a practical matter, there is an obvious complementarity between the two positions: Government spending can help stimulate demand, while stabilizing the money supply arrests deflation. And today macroeconomics is characterized by a synthesis of those two components.

But something interesting happened as modern macroeconomics was being born. First, as Northwestern's Gordon has noted, in the transition to the postwar era, economists switched their focus from explaining business cycles as self-contained phenomena (as Schumpeter had) to simply explaining how various "shocks" to the economy--a rise in interest rates, a rise in oil prices--were propagated. Second, in the transition from Keynesianism to monetarism, the focus shifted again. The new goal was to define the relationship between the money supply and economic growth, and calibrate the economy accordingly. With that growing preoccupation, it's no surprise that, by the late 1960s, professional economists had reached a consensus about the power of monetary policy--that it could, for example, cure a recession regardless of its cause. By the late 1970s that confidence had pervaded Washington as well.

As we're now learning, that confidence wasn't entirely justified. But it's not just that monetary policy may prove ineffectual for the kind of recession we face today. After all, with the sort of capacity glut we're now facing, just about every policy response--tax cuts, spending increases, you name it--would fall similarly flat. The real trouble is that our faith in monetary policy may have actually made this recession worse. To see why, you have to return again to the Great Depression, whose lasting contribution to contemporary macroeconomics was the idea that severe recessions and depressions are almost always the result of policy mistakes.

According to the current consensus, the glaring policy mistake of the 1930s was a failure to stabilize the money supply to offset deflation. As a result, economists grew convinced that the money supply must be increased any time there's a serious negative shock to the economy. (The process is known as "increasing liquidity" and can be accomplished by buying up Treasury bonds, which the Fed does when it lowers interest rates.) For the most part, that has been a pretty successful approach. In the wake of the stock market crash of October 1987, for example, the country very likely avoided a financial crisis because Alan Greenspan had learned exactly that lesson.

But as our faith in monetary policy (and one particular monetary policymaker named Greenspan) grew, the determination to respond to crises slowly evolved into an attempt to anticipate, and head off, crises before they ever appeared--what financial observer Jim Grant derides as "anticipatory monetary policy." In 1997 concern about financial markets in Southeast Asia deterred Greenspan from raising interest rates despite evidence of excess liquidity in the financial system. Then, in late September 1998, Greenspan dropped the federal funds rate by a quarter-point amid continuing financial woes in Southeast Asia, a Russian debt default, and the meltdown of the high-powered--and highly leveraged--Long-Term Capital Management hedge fund. That kind of temporary move was arguably necessary to prevent a global financial crisis. The problem was that it turned out not to be temporary at all. Greenspan lowered interest rates by a quarter-point twice more over the subsequent two months, cuts he didn't begin to reverse until the following summer for fear of leaving the economy vulnerable.

Not until November 1999 did the Fed raise interest rates back to their September 1998 levels, at which point many felt that still further hikes were necessary. But fear of Y2K-related problems--problems for which he had little concrete evidence and which subsequently proved nonexistent--caused Greenspan to defer these additional hikes until February. As with the decision to keep interest rates low earlier in the year, the move was purely prophylactic.

Why was Greenspan so anxious? For one thing, he was still smarting from what former Fed Vice Chairman Alan Blinder has called the "only blemish on [his] record"--the failure to head off the 1991 recession that helped usher President George H.W. Bush from office. As one Wall Street economist put it earlier this year, "Alan Greenspan was blamed for one Bush recession. He doesn't want to be blamed for another." In addition, Greenspan had become a true believer in the power of information technology to alter the fundamental laws of economics--in particular the idea that technology had permanently increased productivity and, therefore, the rate at which the economy could grow without sparking inflation. In such a context, monetary policy seemed to hold out the promise of a painless business cycle--one in which no one ever lost his job. Greenspan practically said as much. At a Senate Budget Committee hearing in January, he explained that the difference between a mere slowing of growth and an actual recession (i.e., a period in which the economy contracts) was typically how consumers responded--the implication being that the Fed could prop up consumer confidence and thereby keep the economy growing indefinitely.

And yet, as we're discovering today, the old business cycle still exists: When the combination of innovation and optimism produces vast excesses, one way or another we have to sit tight until those excesses are wrung out. Moreover, as a purely logical proposition, anything we do to add to the excess will make the subsequent wringing-out more painful.

Which, it seems, is precisely what Greenspan did with his preemptive strikes against recession in the late 1990s. By any objective measure, the excess in the economy was already great in 1998. Between 1994 and 1998, capacity in three top technology sectors--computers, communications equipment, and semiconductors--increased more than 400 percent. Or, to take an old-economy example, in 1995 the nation had just under 27,000 indoor movie screens. By 1998 that number had risen to over 33,000, even though admissions hadn't risen nearly as quickly.

By trying to postpone the inevitable sorting-out period, Greenspan ended up vastly increasing those excesses: Capacity in the three tech sectors more than doubled between 1998 and 2000; the movie industry added 2,200 more screens even as admissions fell 4 percent. It's not hard to see why. First, money injected into the financial system inevitably gets used. When Greenspan's fear about financial crisis proved overblown, that use turned out to be investments that "were not justified by fundamentals, and not socially or privately productive," according to Ohio State University economist Steve Cecchetti. Second, and more generally, when Wall Street concludes that the Fed is committed to bailing it out, investors and businesses behave more recklessly than they otherwise might.

Now that the recession is underway, cutting interest rates will prevent it from being worse than necessary. While monetary policy isn't going to get businesses investing in new equipment any time soon, and it won't do much to save firms that never should have existed in the first place, it can prevent sound firms from going under by helping them pay off debts. Similarly, it can help consumers to refinance mortgages and makers of big-ticket items like cars to offer cut-rate financing. Government spending might also help ease the blow--perhaps by extending health insurance to those who lost it with their jobs. But the hard truth is that we can't avoid this recession, and there's not a lot we can do to make it shorter. The people who make American economic policy may be about to learn that lesson. It's just too bad they couldn't have learned it sooner.