Welcome (Back) to the Nuclear Age

Americans haven’t thought seriously about the threat of nuclear war in a long time. Now, as climate change accelerates, the peril and promise of nuclear technology feel more fraught than ever.

Russian President Vladimir Putin, surrounded by top military officers and officials, tours a military flight test center.
ALEXEY NIKOLSKY/AFP/Getty Images

In an era defined by climate change and the Covid-19 pandemic, one apocalyptic potential fell off many Americans’ radars: nuclear weapons. But in the bitter February twilight, Russian President Vladimir Putin put them squarely back in the center of discourse when he invaded Ukraine—and appeared to threaten anyone who intervened with radioactive retribution.

“They must know that Russia will respond immediately, and the consequences will be such as you have never seen in your entire history,” Putin said in a public address. In the same speech, he added: “Today’s Russia remains one of the most powerful nuclear states.”

Despite the aggressive posture, many analysts believe that nuclear war remains unlikely, given that it would be devastating for everyone on earth, Russians included. But the possibility of nuclear escalation has always been unusually hard to parse. While climate catastrophe is all but guaranteed unless we act now to curb greenhouse gas emissions, and Covid-19 transmission can be modeled to manage risks, nuclear warfare has always existed on its own strange continuum between the ludicrous and the imminent.

Such disastrous potential has been clear since 5:29 a.m. on July 16, 1945, the moment Manhattan Project scientists detonated the first atomic bomb in a test code-named Trinity, at a site in the white sands of New Mexico. But recent events have pulled this history back into the headlines. “It’s a horrible version of déjà vu,” said Kate Brown, professor of science, technology, and society at MIT and the author of several books on the Cold War and its nuclear legacy.

For many people with intimate ties to irradiated landscapes, however, the atomic age never stopped. While the United States, Russia, the United Kingdom, France, China, India, Pakistan, North Korea, and Israel possess or are strongly suspected of possessing nuclear weapons capability that could reshape our future, dozens of countries already contain contaminated sites. In addition to the two atomic weapons the U.S. dropped on Japan, at least 2,400 atomic weapons have since been detonated, first above ground and later below, as part of widespread nuclear testing in the postwar period.

For many theorists, that makes radiation an important marker of human influence on the planet. In 2016, a group at the International Geological Congress presented possible start dates for the Anthropocene, the unofficial name for our current human-dominated geological age. There were many candidate markers, including plastic trash, pesticides, concrete, and domestic chicken bones. “We are spoiled for choice,” one geologist told The Guardian. But the single clearest signal they found was the marked increase in radionuclides starting around 1950, the enduring consequence of those tests.

From Japan to New Mexico to French Polynesia, these already-irradiated spaces provide a kind of keyhole into our possible futures. While their fallout has decayed over time so that many sites no longer present an active threat to human health, the memories linger. In these communities and others like them, apocalyptic potentials don’t erase each other but accumulate, like the effects of extended radiation exposure in the body or the continuous release of carbon into the atmosphere.

Since the 1990s, the threat of nuclear war has largely been on pause, thanks in part to the global anti-nuclear movement. In the U.S., the movement brought together churches, labor unions, and other left-wing activists in the fight against nuclear weapons proliferation and, just as often, nuclear energy generation and uranium mining. They filed lawsuits to disrupt the administrative procedures that license new nuclear power plants, put (failed) propositions on state-level ballots, and took to the streets. In June 1982, as many as one million people gathered in New York City’s Central Park in the largest disarmament protest in history.

As the twentieth century drew to a close, anti-nuclear campaigns bore fruit. While many might attribute deescalation to the fall of the Soviet Union, the process began a few years earlier and ultimately encircled the globe. In 1987, the U.S. and USSR signed the Intermediate-Range Nuclear Forces Treaty, the first time the nations agreed to reduce their nuclear stockpiles. In 1995, parties to the original Treaty on the Non-Proliferation of Nuclear Weapons, which aimed to prevent the spread of weapons to nonnuclear nations, met to extend its terms indefinitely. And in 1996, the United Nations adopted the Comprehensive Nuclear-Test-Ban Treaty, which banned above- and below-ground weapons testing (at least among signatories), removing a major source of radiation.

None of these policies were perfect; many remain unenforced. But the result was a relative stasis, as the risk of nuclear warfare appeared to be in retreat. That sense of safety was periodically punctured by news of Iran’s interest in developing its nuclear program or Donald Trump’s “Little Rocket Man” tweets directed at North Korean leader Kim Jong Un. And it was undermined by spending statistics: In 2019, for example, the U.S. spent a whopping $37.4 billion on its nuclear weapons, according to the Nobel Peace Prize–winning organization the International Campaign to Abolish Nuclear Weapons. Yet for many Americans, the risks of nuclear armaments have fallen behind other, seemingly more pressing concerns.

But Putin’s recent actions in Ukraine have not only resurrected Cold War anxieties. They’ve hastened the realization that the neoliberal world order—premised on international cooperation and geopolitical stability through open markets, shared security, and transnational institutions—was always something of a think tank–promulgated fiction. That comforting myth has left most Americans, including many politicians and pundits, decades out of practice in thinking through nuclear quandaries of all kinds, if they ever had that skill in the first place.

Rebuilding that muscle will take effort. Managing the global nuclear threats that exist today requires a dizzying blend of military strategy, foreign policy, and diplomacy, as well as environmental and health science. But there’s also a moral clarity to the absolute need to prevent nuclear conflict, as lessons from the recent past make clear.

Today the hibakusha, as the people affected by the atomic bombings of Hiroshima and Nagasaki are known in Japan, still tirelessly share their stories of the real and lasting damage of nuclear warfare. Almost 80 years out from the first mushroom cloud, their numbers are shrinking, but there is always a community ready to pick up where they left off. After the French colonial government no longer found Algeria suitable for its nuclear tests, for example, it moved its operations to French Polynesia, where bombs were routinely detonated from 1966 to 1996. Over a hundred thousand people are estimated to have been affected by the resulting contamination.

Perhaps one of the biggest challenges we face is thinking through the entanglement between nuclear weapons manufacturing and nuclear energy production. While the U.S. has consistently fretted over the possibility of foreign governments turning civilian nuclear programs into weapons programs, perhaps the bigger risk posed by nuclear energy is much more mundane—and more urgent than ever.

Nuclear power, a low-carbon energy alternative, is being sold as an essential part of climate action. Some see existing plants as a bridge from the fossil fuel economy to other renewables, while others, like Bill Gates, who owns a reactor design company called TerraPower, view the next generation of nuclear energy as the sector’s long-term future. Yet the consequences of existing global warming are, ironically, raising serious concerns about the future viability of the technology.

Rising temperatures have already been enough to shut down nuclear energy production, calling its reliability into question. In the near future, rising waters could trigger the kind of meltdown seen at Fukushima in 2011, when a tsunami flooded the plant and knocked out the emergency generators that keep the reactor cores cool. To remain safe and productive, “nuclear power plants need centuries of money and peace and prosperity,” Brown told me. A week ago, many still believed that stability was ours to give. Increasingly, it feels like a promise already broken.

Or at least that’s what my own atomic childhood taught me. I grew up on the leaking edge of the Hanford Nuclear Reservation, often called the “Most Toxic Place in America.” Hanford, or “the area,” as we called it, was home to the world’s first full-scale plutonium production reactor. Its isotopes made the Trinity test weapon and the bomb dropped on Nagasaki possible. Now an estimated 56 million gallons of toxic waste is barely contained beneath the shrub-steppe surface and inching ever closer to the nearby Columbia River. But in 1984, a newer reactor, known as the Columbia Generating Station, opened on the site as the weapons-grade reactors were being decommissioned. Today, it supplies about 10 percent of the electricity in Washington State, which is known for both the low price of its energy and its relatively low-carbon energy mix.

Even in these relatively peaceful, post–Cold War years, I was acutely aware of the contradictions Hanford embodied: a power that in one form was spectacularly generative and, in another, uniquely able to extinguish; a true marvel of human ingenuity yet always slipping from our control; and an empirical phenomenon mired in decades of propaganda.

Nothing about my proximity to Hanford made me any kind of nuclear expert, but even as a child, the stark juxtaposition did make me wonder—about the choices we made and the choices we could make in the future. About how we weigh possible gains against an existential risk. It’s the kind of question that many in my generation who grew up farther from these sites are likely asking themselves for the first time, now with the added complexity of decarbonization. But as a quick glance at a map of the world’s nuclear landscapes will show, many people have walked these roads before, and they know a little bit about where they lead.