Power Struggle

In the winter of 1984, a young scientist named Steven Chu was working as the new head of the quantum electronics division at AT&T's Bell Labs in Holmdel, New Jersey. For months, he'd been struggling to find ways to trap atoms with light so that he could hold them in place and study them better. It was an idea he'd picked up from an older colleague, Arthur Ashkin, who had wrangled with the problem all through the 1970s before finally being told to shut the project down--which he did, until Chu came along. ("I was this new, young person who he could corrupt," Chu later joked.) Now Chu, too, had hit an impasse until, one night, a fierce snowstorm swirled through New Jersey. Everyone at Bell had left early except for Chu, who lived nearby and decided to stay a bit longer. As he watched the snow drift outside, he realized they'd been approaching the problem incorrectly: He first needed to cool the atoms, so that they were moving only as fast as ants, rather than fighter jets; only then could he predict their movements and trap them with lasers. It was a key insight, and Chu's subsequent work on cooling atoms eventually earned him a share of the Nobel Prize in physics.

While it may sound inevitable in retrospect, big breakthroughs like that don't come along too often. Nowadays, though, Chu is betting that they will--and must. As the U.S. energy secretary, Chu has been tasked with reshaping the country's trillion-dollar energy economy, to reduce America's reliance on fossil fuels and cut greenhouse-gas emissions 80 percent or more by mid-century--essential to avoiding catastrophic climate change. It's an enormous goal, and Chu believes the only way to achieve it is with multiple Nobel-caliber leaps in energy technology. "I mean technology that is game-changing, as opposed to merely incremental," he told Congress in March--technology that, as a recent Department of Energy (DOE) task force described it, will require an understanding of basic physics and chemistry "beyond our present reach."

Not everyone agrees that the fate of the planet hinges on such far-reaching advances. Given the increasingly dire revelations of climate science--greenhouse-gas emissions are rising more quickly than projected, Arctic sea ice is melting faster than once thought possible--many analysts believe we need to act with all due haste, before the planet hits a tipping point. And much of the green movement tends to follow Al Gore's oft-stated assurance that, "to save the future, we have everything we need except the political will." Chu's view also sits uneasily alongside that of the U.N. Intergovernmental Panel on Climate Change (IPCC), which, in 2007, declared that stabilizing carbon in the atmosphere at safe levels could be done with a mix of technologies that were either currently available (like wind power) or just around the bend (plug-in hybrids, for instance). This camp argues that, while existing clean-energy gadgets may have flaws, they'll get steadily better and cheaper as they're deployed, and, in any case, the immense urgency of the climate problem leaves us little alternative.

Chu, for his part, acknowledges that the threat posed by global warming requires a wide array of practical and immediate responses, including a cap on carbon emissions to make fossil fuels less competitive. Still, according to those familiar with his thinking, Chu sees the challenge of spurring breakthroughs in energy research as an even larger priority than setting a carbon price and promoting existing clean-energy sources. "I think science and technology can generate much better choices," he told The New York Times in February. "It has, consistently, over hundreds and hundreds of years."

It's an attractive vision. There's no question, after all, that investments in energy R&D have become dangerously anemic over the years, and that better technology would make the task of curbing emissions far less arduous. But Chu's stance raises plenty of knotty questions. What's the best way to promote innovation? And can these vast leaps in scientific understanding arrive in time to fend off rising global temperatures? And, most importantly, what happens if they don't? As it turns out, there's reason to worry that counting on radical breakthroughs to save the planet may be a fraught and uncertain venture.

Talk to chemists and physicists toiling on the frontiers of clean-energy research and it's easy to be dazzled by the possibilities. Take solar power. The sun radiates more energy down to Earth in an hour than humans use in a year. It's just that, right now, even the best solar photovoltaic panels are relatively inefficient at harvesting that resource--and storing the energy for when the sun isn't shining remains tricky. So, some scientists have gone back to the drawing board and tried to pursue artificial photosynthesis, mimicking the deft process that plants use to turn sunlight and water into fuel.

Progress has been slow, but, in 2008, Daniel Nocera, a chemist at MIT, made headlines for a breakthrough in one small piece of this project. He created a cheap chemical catalyst that could use solar energy to break down water into oxygen and hydrogen, which could then be stored or run through a fuel cell to generate electricity. "In theory, if you could use the sun's energy to convert just one-third of the amount of water in an Olympic-sized swimming pool into hydrogen and oxygen per second, you could take care of the world's energy needs," Nocera explains. At the time it was announced, one researcher called it "probably the most important single discovery of the century" for solar power, and the news conjured up shiny visions of leaf-like catalysts creating hydrogen fuel to power homes and cars--provided, of course, that a flurry of follow-up problems could be resolved: how to absorb sunlight more efficiently, how to build the fuel cells, how to store the hydrogen cheaply, and so on. Those hurdles are formidable, and Nocera has his fair share of doubters, but the broad concept is undeniably appealing.
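That pool figure checks out on the back of an envelope. Below is a minimal sketch of the arithmetic in Python, using standard textbook values for pool volume and the energy needed to split water; none of these numbers comes from Nocera's paper:

```python
# Back-of-envelope check of Nocera's pool claim. All constants are
# standard textbook values assumed for illustration, not his figures.
POOL_VOLUME_M3 = 2_500            # Olympic pool: 50 m x 25 m x 2 m
WATER_DENSITY_KG_M3 = 1_000       # kilograms of water per cubic meter
SPLIT_ENERGY_J_PER_MOL = 286e3    # enthalpy to split liquid H2O, J/mol
WATER_MOLAR_MASS_KG = 0.018       # kilograms per mole of water

mass_per_second = POOL_VOLUME_M3 * WATER_DENSITY_KG_M3 / 3  # one-third of a pool
energy_per_kg = SPLIT_ENERGY_J_PER_MOL / WATER_MOLAR_MASS_KG
power_watts = mass_per_second * energy_per_kg

print(f"{power_watts / 1e12:.0f} TW")  # ~13 TW
```

Thirteen terawatts is roughly the rate at which humanity consumes energy today, so the claim holds up to the right order of magnitude.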

Over the past few years, the DOE's Office of Basic Energy Sciences has released a series of reports that have taken such "if only..." notions quite seriously. The premise of these reports is that existing clean-energy technologies--from batteries to nuclear power--have hit ceilings that can't be surpassed by engineering alone or by the tinkering that private companies are now doing at the margins. In many cases, cracking these limits will require dramatic strides in scientific understanding. "Companies are doing great work, but it's all been based on an existing set of basic concepts and understandings to which we need to continue to add," says Mark Ratner, a chemist at Northwestern University who contributed to the DOE reports.

These next-level breakthroughs may, in effect, require a whole new type of science. Ratner explains that, during the twentieth century, scientists became increasingly sophisticated at studying quantum effects and examining matter at an ever-smaller scale--leading to the discovery of key materials such as superconductors and carbon nanotubes. Now, he says, the big challenge is to step beyond observation and figure out how to actually control and direct matter and energy at the atomic level. "We still don't really know how to control chemical reactions at that level, for instance," he says. "Something like that could really deepen what we could do."

Last December, the office released an "energy challenges" report that offered a lavish vision of what a new "control science" could accomplish. If, for instance, the steel used to make nuclear reactors were made by manipulating atoms at the nanoscale, rather than through traditional bulk processes, we could have materials that self-heal and better resist chemical corrosion and intense radiation, allowing the construction of nuclear reactors that operate at much higher temperatures and efficiencies--meaning more power and less waste. Or ultra-light materials could be used to build cars that require far less energy to propel. Or batteries built on chemistry yet unknown could allow electric vehicles to vastly surpass their currently limited range. Or solid-state lighting that used just a fraction of the power that incandescent light bulbs use, and without the glare of compact fluorescent lights, could drop the percentage of electricity the nation needs for lighting from 22 to 2 percent.

While the DOE reports were drafted during the Bush years--conservatives have often touted technological upheaval as the solution to climate change, but mainly as an excuse to put off carbon regulations--the findings have particularly resonated with Chu, who has long agreed that deep research and disruptive new technologies are necessary for solving our energy woes. With funds pouring in from the Obama administration, the DOE's labs are beginning to channel their basic research efforts toward specific energy problems, both big and small. "There's always a question of how much you focus on engineering and how much on targeted, basic research," says George Crabtree, a scientist at Argonne National Laboratory who helped oversee the DOE reports. "It's dangerous to say this, because I love engineering--the plug-in hybrid, that was engineering. But we need to shift back a bit to the basic sciences side."

As alluring as this scientific vision may be, we may not have time to wait for radical advances to emerge. Global warming is happening now, and many of the changes needed to slow or stop it will have to be made in the next few decades. According to the IPCC, wealthy countries need to cut their carbon-dioxide emissions by up to 40 percent below 1990 levels by 2020, and by more than 80 percent by 2050, in order to have a shot at stabilizing atmospheric carbon levels below 450 parts per million. Skate past that number and we run a high risk of triggering carbon-cycle feedbacks beyond our control, putting the world on course for massive sea-level rises, widespread drought, and other assorted horrors.

"We need to replace the entire energy system in three or four decades," says Joseph Romm, a Clinton-era Energy Department official and one of the fiercest critics of those banking on major technological breakthroughs. "The notion that technology that doesn't even exist yet could be invented, demonstrated, and then commercialized in that time frame--it's absurd."

Indeed, the question of timing turns out to be utterly central. Chu is mindful of this: In addition to his longer-term vision, he has put an emphasis on intermediate advances that can scale up in just five to ten years with proper support--from improved batteries to sensor-based technology to cut energy use in buildings. Yet many of the truly sweeping breakthroughs envisioned by the DOE task force could take far longer, and the energy industry tends to evolve at a sluggish pace, with major new discoveries often taking decades to filter into the marketplace.

Consider, for instance, the high-temperature superconductor. In 1986, two researchers at IBM's labs in Zurich, Alex Müller and Georg Bednorz, announced that they had discovered a new brittle ceramic compound that could conduct electricity with little resistance at relatively high temperatures (previous superconductors needed to be cooled with expensive liquid helium, making them impractical for most uses except boutique applications, such as MRIs and particle accelerators). Condensed-matter physicists everywhere swooned over the news. The press grew giddy over a "Jetsons"-esque future replete with magnetically levitated trains, ultrafast computers, and new transmission lines that could carry five times as much electricity as normal lines over long distances with low losses--a true game-changer for electricity markets. "That summer, I made a lot of money as a consultant," chuckles Bruce Strauss, a high-energy physicist who now works for DOE. "You saw venture capitalists running around with open checkbooks, asking, 'Where should I invest?' I kept telling them you're not going to see real applications for twenty or thirty years--they looked at me like I was a doomsayer." Sure enough, more than twenty years later, those transmission lines are still in the embryonic stage. The superconductors perform admirably in demonstration projects, but no one has mastered a way to bring production costs down to halfway reasonable levels.

While counterexamples exist, some analysts warn that the superconductor case is all too illustrative. Romm cites a 2001 study by Royal Dutch/Shell Group, which found that, historically, "it has taken 25 years after commercial introduction for a primary energy form to obtain a 1 percent share of the global market." Silicon solar photovoltaic cells, for instance, ranked as a major advance when they were developed more than half a century ago, yet they still provide less than 0.1 percent of America's electricity today. How much longer might it take for even more complex inventions--artificial photosynthesis, say--to journey from the laboratory out into widespread use? (Even Chu recognizes that some inventions may not emerge in a reasonable time frame: He recently cut federal support for hydrogen-powered cars, arguing that widely deploying these vehicles in the next few decades would require "four significant technological breakthroughs"--too improbable for even the most fervent believers in science.)

There may be an even more basic problem, too: Science has to continue rapidly advancing--or even speed up--for many of these big breakthroughs to occur in the first place. Last year's DOE report argued that achieving its futuristic vision would require significantly increasing the rate of scientific discovery, in part because, as the materials and phenomena under study get more complex, entirely new scientific laws often need to be grasped at each step of the way. "Look at superconductivity," explains Graham Fleming, a chemist at the University of California, Berkeley, who contributed to the DOE reports. "It's got what's called an emergent property--just knowing the properties of the parts doesn't help a lot. Having a better theoretical understanding of those emergent systems is something we need. Right now there's an awful lot of trial and error in making materials that are much more efficient and effective."

So what are the odds that the pace of scientific discovery can accelerate? This is a hotly debated topic, with a wide spectrum of answers. In 2005, Jonathan Huebner, a physicist working at the Pentagon's Naval Air Warfare Center in China Lake, California, published a controversial paper looking at the rate of U.S. patents awarded over time. He argued that the rate of technological innovation has actually been slowing since around 1873--the age of Edison--and that the world could be entering a new Dark Age of innovation by the 2020s. Huebner's paper came under heavy fire from a number of innovation scholars, including Ray Kurzweil, a futurist who is inordinately bullish on the rate of technological progress and who, to his credit, has made a number of bang-on forecasts. (He famously predicted that a computer would be able to beat a grandmaster in chess by the late 1990s.) Kurzweil has suggested that technological change will continue zooming forward until artificial-intelligence machines start replicating themselves. At that point--in or around 2045--the pace of innovation will become so blindingly fast as to be inconceivable. Presumably the answer to our carbon troubles would materialize, as if by magic, at this point.

Most scientists, though, tend to sit in the middle: optimistic that science and technology will continue to push ahead, but not gambling on any particular game-changer. "You can't make those predictions," says Jae Edmonds, a scientist at the Joint Global Change Research Institute in College Park, Maryland. "What I do know is that you increase the likelihood of getting breakthroughs if you both invest in the underlying science and have policies that signal a limit on emissions. If you do that, it's reasonable to expect the technology to come."

That brings up another big question mark: Is there anything the government can do to quicken the pace of innovation, and convert those laboratory insights into commercial reality more swiftly? It's easy to point to the Manhattan Project or Apollo Program as examples where the government faced a seemingly intractable scientific problem, opened its checkbook, unleashed the nation's brightest minds, and watched the obstacles melt away. But that analogy is inapt. Michael Oppenheimer, a geoscientist at Princeton, has pointed out that the Manhattan Project was set up to create a unique product for a single customer--the U.S. government--for whom money was no object. The government can't pursue the same strategy with energy, because price is a pivotal concern.

On this score, the government's record is murkier. Regulations like the Clean Air Act and fuel-economy standards have helped prod private companies to innovate, but shepherding major scientific advances from birth to commercial readiness is another matter entirely, and a difficult task. "It's a complicated, messy, recursive process," says Robert Fri, a former Environmental Protection Agency official who now works at the think tank Resources for the Future. "That's not always the way the government goes about doing stuff." Fri points to the Energy Department's synthetic fuels program in the 1970s, which struggled to invent a viable alternative to oil and fell apart when the price of crude sank back down. To be sure, Chu is aggressively pursuing new ways for the Energy Department to bridge this gap between basic science and the marketplace. He has read, among other things, a recent Brookings report arguing for closer collaboration among DOE's national labs, universities, and private industry, and he is trying to retool the Energy Department along similar lines. Those efforts are promising, but the results remain to be seen.

That doesn't mean it's time to despair. It may be difficult to translate Nobel-caliber insights into everyday household gadgets. But there's another job that the government has been fairly good at doing over the years: supporting technologies that are already viable but, for whatever reason, face barriers to gaining a toehold in the real world. "Historically, when the government has thought through what those barriers were, and targeted that barrier quite precisely, it could have a terrific positive impact," says Fri. In the 1970s, he notes, Energy Department labs realized that using electronic ballasts in fluorescent light bulbs could make the bulbs vastly more energy-efficient. The big roadblock was that, at the time, two ballast manufacturers dominated the market, and neither had any incentive to adopt the new technology, since if one did, the other would quickly follow, and no one would gain any market advantage--they'd just have spent a bit more money. So the Energy Department seeded a few small start-up firms that demonstrated the ballasts and prompted the bigger companies to follow suit, leading to large savings for the public.

These deployment strategies, it turns out, can spur another type of innovation--marketplace learning--which can sometimes be just as crucial as cutting-edge scientific research. "There are three ways of lowering the cost of something," explains Romm. "There's a major technological advance; there are economies of scale; and then there's what's called the experience learning curve. The third one is arguably the most important--as you deploy stuff, you realize what needs to be improved. Where the stress fractures in a wind turbine are. How the mirrors [of a solar thermal plant] perform in a desert after a few months. Or in manufacturing--how you use silicon more efficiently, or make it thinner, or recycle it. None of this is Nobel-caliber." As existing technologies like wind turbines become more widely used, performance steadily improves, incremental research gets adopted, and the costs start falling--a process that can be more reliable than waiting for breathtaking new scientific advances. Studies of wind turbines, for example, have found that costs come down by as much as 17 percent each time production doubles, as a result of learning-by-doing.
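The learning curve Romm describes has a simple mathematical form: if costs fall by a fixed fraction with every doubling of cumulative production, cost becomes a power law in output. Here is a minimal sketch, assuming the 17 percent per-doubling rate cited for wind turbines (the function and variable names are ours, for illustration only):

```python
import math

# Experience (learning-by-doing) curve: cost falls a fixed fraction
# with each doubling of cumulative production.
LEARNING_RATE = 0.17                        # cost drop per doubling
EXPONENT = math.log(1 - LEARNING_RATE, 2)   # ~ -0.27 for a 17% rate

def unit_cost(cumulative_output, initial_cost=1.0):
    """Cost of the next unit once `cumulative_output` units have been built."""
    return initial_cost * cumulative_output ** EXPONENT

# Cost after 0, 1, 5, and 10 doublings of cumulative production.
for doublings in (0, 1, 5, 10):
    print(doublings, round(unit_cost(2 ** doublings), 3))
```

Ten doublings at that rate bring costs down to roughly 15 percent of where they started--no Nobel-caliber discovery required.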

That's a good thing, because it's this sort of incremental innovation that the planet may need to lean on most heavily, at least in the near term. A substantial body of evidence suggests that the world can make huge emission cuts in the next few decades without needing to wait for grandiose new technologies to arrive. In 2004, two Princeton scientists, Stephen Pacala and Robert Socolow, published a much-discussed paper in Science laying out 15 carbon-cutting strategies that have already been field-tested, which they dubbed "wedges." Solar power constituted one wedge, nuclear another, stricter fuel-economy standards for cars a third, reversing deforestation a fourth, and so on. The world, Pacala and Socolow showed, could cut emissions dramatically and still satisfy growing energy demand by deploying just seven of those wedges on a grand scale. (That's no mean task: One "wedge" of nuclear power, say, would mean building 21 new plants each year between now and 2050.) While some experts now believe we need even more wedges to meet the IPCC targets, several research efforts have reached similar conclusions: By combining everything we now have on hand, we can tackle a sizeable chunk of the problem at a reasonable cost. A 2008 McKinsey Global Institute analysis pegged the investments needed by 2030 at around 0.6 to 1.4 percent of global GDP and noted that the impact on the world's economic growth rate would be minimal, "even with currently known technologies."
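The scale of that nuclear wedge is easy to check roughly. In Pacala and Socolow's accounting, one nuclear wedge corresponds to about 700 gigawatts of new capacity displacing coal (a figure from their paper, not this article); a quick sketch shows what that implies for the construction rate:

```python
# Rough check of the "21 new plants each year" figure. The 700 GW wedge
# size is Pacala and Socolow's estimate, assumed here for illustration.
NUCLEAR_WEDGE_GW = 700     # new nuclear capacity displacing coal = one wedge
PLANT_SIZE_GW = 1.0        # a typical large reactor
YEARS = 2050 - 2009        # the build-out window the article implies

plants_per_year = NUCLEAR_WEDGE_GW / (PLANT_SIZE_GW * YEARS)
print(round(plants_per_year, 1))  # ~17 new plants per year
```

That is about 17 one-gigawatt plants a year before counting anything else; folding in replacements for reactors that retire along the way pushes the pace toward the 21 cited above.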

The IPCC lists a wide array of technologies either available now or very likely to surface in the near future that could help reduce emissions. The former category includes (among other things) nuclear power, hydropower, wind power, geothermal, fuel-efficient vehicles, hybrids, biofuels, public transit, reforestation, and landfill methane recovery. The list of just-around-the-corner strategies includes carbon sequestration for coal-fired plants, advanced nuclear plants, tidal and wave power, cellulosic biofuels, and advanced electric vehicles. While no single solution will make more than a modest dent in the world's emissions--none of those items has the vast potential of, say, artificial photosynthesis--when combined, they should steadily chip away at carbon emissions.

One particularly promising example is solar thermal power, in which large arrays of mirrors heat a liquid that can either generate electricity or be stored for when it's cloudy or dark. Such plants are being built now, and, if Congress passed a carbon price that made them more competitive with coal and natural-gas plants, they'd become even more prevalent and the price would continue nudging downward: The National Renewable Energy Laboratory has already estimated that these plants will be able to compete with existing natural-gas plants by 2015, and solar-power expert Ken Zweibel has estimated that they could, in theory, supply up to 69 percent of America's electricity by 2050. It would require a fair amount of government support and infrastructure investments, as well as further R&D, but not a whole new level of science. In the short term, meanwhile, one of the biggest reductions in emissions will come from making buildings and homes more energy-efficient, which typically requires no fancy new technology at all--just smarter regulations.

It's true that, eventually, existing technologies will hit a wall. If electric cars powered by lithium-ion batteries become widespread, for instance, then world lithium supplies could conceivably dwindle. Or, at a certain point, the need for cost-effective electrical-power storage could become a serious issue. (Right now, Denmark gets nearly 20 percent of its electricity from wind, but can only "store" that power by exporting the excess to Sweden and Norway, which send back hydropower when the wind isn't blowing--a clever workaround that will prove less feasible as renewables spread.) "There are a bunch of things we know how to do now, and we want to use all the incentives we can to get those things out there," says Argonne's George Crabtree. "But, eventually, their impact will saturate, as soon as they're fully deployed--and that's where you'll need serious innovation."

By mid-century or so, we really may need the sorts of artificial leaves and futuristic batteries we can barely begin to imagine right now. And it is far less outlandish to expect that, by that time, science will be able to deliver such breakthroughs. But all of those sci-fi technologies decades from now won't mean much if we've long since blown past our carbon budget. That means doing as much as we can now--and fast.