Inside Chernobyl. (Patrick Landmann / Getty Images)
Things aren’t looking good for the Antarctic ice sheet. Out of sight to most of us, it’s melting faster than ever. Scientists now struggle to find floes thick enough for their instruments. Meanwhile, increased rainfall on Greenland’s ice sheet is creating fresh problems. Wet snow absorbs more heat, which speeds up the melt and pushes sea levels higher. Some say we have only a decade left to avoid warming on a life-threatening scale.
It didn’t have to be this way. Long before climate change became widely recognised, a zero-carbon energy source was at hand: nuclear power. Instead of burning molecules, a reactor splits atoms, so no carbon is released. And because a nuclear plant runs day and night, it can compete directly with polluting coal- and gas-fired power stations.
Yet more than 70 years after its debut, nuclear power still hasn’t exactly taken off. Only about 30 countries produce it today. And despite much talk of a “nuclear renaissance”, complete with feverish visions of floating power plants and better supply chains, its share of the global electricity mix has fallen sharply since climate change rose on the public agenda in the Nineties — from a peak of 17% in 1997 to less than 10% today. Since 2000, more plants have closed than opened.
The dominant narrative about nuclear power now is that it’s an outdated technology we would be better off without. In both Finland and France, building a single reactor took a staggering 17 years — not even counting the time spent securing permits. Both of these “prestige projects” went far over budget.
And so, with the planet heating up, it’s no wonder many experts and politicians argue we can’t afford to wait for the nuclear industry to catch up. Despite the obvious benefits, they dismiss nuclear energy as a classic case of too little, too late. Maybe they’re right. But the problem goes deeper. Nuclear power might, in fact, be a case of too much, too soon.
***
When atomic power entered the scene in the Fifties, the future of energy looked bleak. Across much of Europe, the best coal mines were largely exhausted, and almost every suitable site for a hydroelectric plant had already been used. With the discovery of vast oil fields in Saudi Arabia, a new fuel for the future seemed to have arrived. But oil, too, carried its own risks. When Egypt closed the Suez Canal in 1956, oil supplies suddenly stalled.
Other concerns came from scientists who cautiously warned that the Earth was warming. In 1956, the opening sentence of a newspaper article could hardly have been clearer: “The burning of fossil fuels,” it read, “may have very unpleasant consequences for the global climate in about 50 years’ time.” Two years later, a young geochemist named Charles David Keeling set up a measuring station for atmospheric carbon dioxide on a Hawaiian volcano; his measurements would show carbon concentrations climbing steadily, year after year.
Thankfully, there was nuclear energy. With nuclear reactors, a country could drastically reduce its dependence on coal, oil and natural gas. Even better, geologists had shown that the resource used in reactors — uranium — was abundantly available. And even though carbon emissions were not yet on the public’s mind, nuclear plants tackled the more pressing problem of smoke and smog. Had humanity finally found an unlimited source of clean energy?
No wonder nuclear power was heavily promoted. After the war, the global economy was swiftly reordered, sparking rapid recovery. In factories, automation boosted productivity. On the land, fertilisers and synthetic pesticides delivered higher yields. Economies boomed, unemployment fell, and wages rose. For more people than ever, life became more comfortable. Nothing captured this leap into modernity better than the enigmatic appeal of nuclear power.
Just imagine: in countries facing drought, nuclear plants could power desalination, ending conflicts over scarce water. They could also drive large-scale fertiliser production, ensuring fields remained fertile. And nuclear energy promised a revolution in transport, supplying the power for trains, ships, planes and even rockets. The possibilities seemed endless.
The words of Lewis Strauss, a leading figure in all things nuclear, became legendary after a speech in which he declared:
“It is not too much to expect that our children will enjoy in their homes electrical energy too cheap to meter, will know of great periodic regional famines in the world only as matters of history, will travel effortlessly… and will experience a life span longer than ours.”
There was only one catch: the very same technology that powered these new plants was also used to produce the world’s deadliest weapons. During the Second World War, scientists and the US military worked together to turn nuclear fission into a weapon: the atomic bomb. Near the end of the war, it was dropped on Hiroshima and Nagasaki. A single bomb had never caused such destruction. Governments everywhere were quick to see the bomb’s strategic value. By 1949, the Soviet Union had carried out its first atomic test. With the “red menace” stretching from Berlin to Beijing, tensions soared between the communist East and the capitalist West. Both superpowers now possessed a superweapon shrouded in secrecy.
These were anxious times. Never before had humanity held a weapon with which it could destroy itself. If a government leader were to launch such an attack and trigger nuclear war, smoke and dust from burning cities would rise into the atmosphere and clump together, blocking sunlight. Rain would cease to fall. Earth’s temperature would plunge to ice-age levels. Harvests would collapse, and for the survivors, famine would be inevitable.
It was a former military officer — Dwight D. Eisenhower — who offered a way out of this conundrum. He had led the D-Day invasion in 1944 as Supreme Allied Commander, and the memory of the Second World War was never far from his mind, nor was the looming threat of World War III. When he became president in 1953, Eisenhower worried that society was being paralysed by fear of total annihilation. That same year, he delivered a speech at the United Nations General Assembly that would become a turning point in history.
Eisenhower described the immense explosive power of the US nuclear stockpile. How to build such weapons was no longer a secret. This, he warned, was a “danger shared by all”. Then the US president signalled his willingness to scale back the weapons arsenal. He wanted to take nuclear technology out of military hands and put it to peaceful use — in medicine, agriculture and, above all, energy supply. “The miraculous inventiveness of man,” he said, “shall not be dedicated to his death, but consecrated to his life.”
In the wake of Eisenhower’s speech, soon known as “Atoms for Peace”, Washington launched a major charm offensive to showcase peaceful applications of nuclear power. Brochures, posters, films, comics and children’s books spread the message.
It worked. Within a few years, dozens of countries had signed contracts with the United States for reactor construction or the supply of enriched uranium.
Nuclear power advanced rapidly. Its heyday came in the Sixties, when plants were still relatively cheap to build and kept getting larger. No other energy source had ever gained such a large share of the market so quickly. If anything, the future looked nuclear.
***
Not so fast. While public opinion on nuclear power was warming, not everyone joined the chorus. Critics dismissed the utopian dreams. To some, nuclear fission smacked of hubris.
Above all, the shadow of the atomic bomb loomed large. As the Cold War deepened, more and more people realised that nuclear plants were tied far more closely to nuclear weapons than officials had admitted. The evidence was plain: in the United States, the Atomic Energy Commission (AEC) oversaw both power plants and bomb facilities. The basic physics was the same, and so were many of the people involved.
Although the number of plants surged after the Atoms for Peace speech, the boom was overshadowed by the spread of nuclear weapons. By the time Eisenhower left office in 1961, the global stockpile had jumped from just over 800 warheads to nearly 20,000.
A simple logic took hold: more nuclear power meant more nuclear weapons. After all, the countries that owned the bombs had built the very industry that produced them. Open the door to nuclear plants, and it seemed only a small step to all-out war.

From the start, this connection was clear. In the early 1900s, scientists discovered that enormous amounts of energy were somehow locked inside the atom. Nuclear physics began to take shape as a new discipline once Ernest Rutherford and Frederick Soddy demonstrated the transmutation of atoms — a natural process of radioactive decay that released energy. Some pioneers dreamed that this hidden power could bring a clean and prosperous world within reach. But they also imagined something darker: if the energy inside the atom could be freed, it might unleash an explosion — a huge explosion. In 1903, Rutherford noted: “Some fool in a laboratory might blow up the universe unawares.” The following year, in a lecture to a British Army corps, Soddy warned that whoever managed to break open the atom and harness its energy would possess “a weapon by which he could destroy the earth if he chose”.
The idea quickly moved beyond academia and into the public imagination. In 1913, H.G. Wells, then the most popular science fiction writer, published The World Set Free, a novel envisioning a mid-20th-century war in which both sides wielded a new radioactive weapon of unprecedented power. Wells even gave it a name: the “atomic bomb”.
Fiction became reality. In 1945, the atomic bomb proved it could wipe out an entire city. Soon, nuclear weapons testing became a cause for alarm. Scientists issued warnings: even minor fallout could enter the food chain and weaken human bodies. Some predicted that exposure to radioactive particles might trigger an epidemic of cancer and genetic mutations.
Only decades earlier, the picture had been very different. Exaggerated fantasies of what radiation might do were a staple of books, comics and films. Exposure bestowed magical powers. In the Twenties and Thirties, it was widely believed that radiation enhanced health. Radium became a symbol of vitality. Doctors prescribed it for ailments ranging from heart disease and high blood pressure to arthritis, epilepsy, headaches, diabetes and rheumatism — practically the entire medical encyclopaedia. A craze followed. Radioactive ingredients appeared in soaps, salves, bath salts, face creams and hair tonics. You could snack on radioactive chocolate and then brush your teeth with toothpaste laced with thorium, for a sparkling smile.
But it wasn’t all magic. Victims of overexposure felt weak, developed cataracts, lost their hair or became temporarily infertile. The worst damage appeared in doctors and assistants who experimented freely without protection. Some tested X-ray equipment by holding their arm in front of the screen. While many of their patients benefited from diagnostic radiation, the experimenters themselves suffered terribly, as it became clear that high doses of radiation damage DNA and cause cancer.
Some comic books cast radiation as a blessing, granting superheroes extraordinary powers. Others portrayed it as a curse, unleashing monsters with deadly rays. This ambivalence was embodied in Superman: his X-ray vision let him see through walls, yet he was powerless against kryptonite’s radiation.
After Hiroshima, the atomic bomb brought these half-buried fears to life. Americans realised that while they could not prevent a nuclear war, they might at least stop nuclear tests. Beginning in the Fifties, protests filled the streets demanding a ban. Eventually, above-ground tests were halted, eliminating most fallout. Yet even as the arms race accelerated beneath the surface, public protests faded. But the fears never fully disappeared.
So what happened? As historian Spencer Weart argued in Nuclear Fear: A History of Images, public attention shifted — from nuclear weapons to nuclear plants. Doesn’t a reactor give off radiation too? What if an accident were to occur? Would a nuclear plant explode like an atomic bomb? And what about the waste? Such questions soon turned into a new goal: banning nuclear power altogether.
Developed in a time of growing distrust of technology and contempt for science, nuclear power became the symbol of needless complexity. There must be an easier way to get energy than splitting unstable atoms of enriched uranium. Nuclear plants were enclosed by fences and warning signs, shrouded in secrecy and danger. Surely, this is not what progress looks like.
Today, supporters of nuclear energy point out that accidents are so rare that nuclear power is as safe as solar and wind. They stress that only a minuscule amount of uranium is needed. They cite calculations showing that all the highly radioactive waste produced worldwide since the Fifties would fit into a single football stadium.
And that murky boundary with nuclear weapons production? For decades it has been tightly monitored by the International Atomic Energy Agency, which even received a Nobel Peace Prize for its efforts. Meanwhile, the global stockpile of nuclear weapons has dropped from a peak of 70,000 in the Eighties to 12,000 today.
Critics counter that the shrinking size of the arsenal is irrelevant, since leaders like Vladimir Putin or Kim Jong Un need only one bomb. And while the volume of nuclear waste may be small, it hardly seems safe if the industry insists on burying it deep underground.
Then there is the cost. A blizzard of regulations — combined with governments that cancel projects midway or shut down reactors prematurely — has made building a nuclear plant exorbitantly expensive and painfully slow.
This has kept us in a stalemate for decades, with no end in sight. Since the mid-Eighties, the combined share of zero-carbon energy in the global electricity mix has risen only marginally, while the fossil share has grown slightly more. As power consumption continues to rise, producing enough clean electricity will be a formidable challenge.
Leading organisations of climate scientists and energy experts are unambiguous about the role of nuclear power. The IPCC lists it as one of the “technology solutions”. The UNECE argues that climate targets cannot be met without scaling it up.
But who’s listening? Not the politicians who have dedicated themselves to environmentalism or socialism and worked tirelessly to restrict, regulate and reduce the development of nuclear power. And certainly not the campaigners at legacy organisations such as Greenpeace and Friends of the Earth, which have warned against nuclear power for generations. Dropping old ideas is never easy.
And while, among the general public, acceptance of nuclear power is growing, modern liberal societies are hardly the most fertile ground for expanding the nuclear fleet. In densely populated Europe, with its strong democracies, every proposed site for a new plant is bound to face public protest.
For now, the so-called “nuclear renaissance” rests on promises and pledges — words without substance. So far, there is little sign that we can muster the will to build, keep and nurture nuclear plants.
***
The claim that nuclear power is not a quick fix goes back decades. Already 50 years ago — long before accidents drove up costs and tightened regulation — experts warned that it would arrive “too late” to meet rising energy demand. It would take at least 25 years to build enough plants to end the crisis. One newspaper even called it “a classic case history of technology that just didn’t make it on time”.

Yet a closer look at history suggests the opposite: nuclear power was always miles ahead of society itself. For one thing, radioactivity was discovered when many doctors still didn’t wash their hands. For another, the atom bomb was dropped in a world where Polish soldiers on horseback, armed with lances and bayonets, had recently faced German tanks, while the first nuclear reactor was built before anyone owned a colour TV.
Perhaps we weren’t ready for something so disruptive. Perhaps that is why some clung to exaggerated fears while others indulged in naive optimism. It may explain why we still can’t come to terms with nuclear power.
Frederick Soddy foresaw this tension early on. The chemist who, with Ernest Rutherford, demonstrated the mysterious transmutation of atoms in the early 1900s, “fervently hoped”, according to his biographer, “that the control of atomic energy would be postponed until society had become sufficiently mature to take responsibility for this achievement of scientific investigation, yet not postponed indefinitely, for then there would be an inevitable energy crisis”.
But when is society ever going to be mature enough? Even with the current revival of interest in nuclear energy, its proponents still seem to be hesitating. Some call for innovations: building smaller reactors, using thorium fuel, recycling highly radioactive waste, or even betting on nuclear fusion. Advocates hope such advances will ease concerns about safety and waste. More crucially, they expect the new designs to be cheaper, faster and easier to build. Just last week, the US military airlifted a compact reactor across three states — a highly visible showcase of President Trump’s ambition to modernise an industry long seen as stagnant.
Yet the sober reality is that none of this has been proven. By chasing innovation, nuclear advocates risk tossing aside a technology that works for experiments that will take decades to deliver. It is as if they’re getting cold feet.
Others in the pro-nuclear camp argue for sticking with the technology we already have. Yet they seem to overlook that large light-water reactors thrived mainly under central economic planning and state-owned utilities. Scaling up “old nuclear” would require societies to move away from liberalised energy markets — a systemic change that is unlikely to happen. In that sense, they may be the real ones with cold feet, delaying the expansion of nuclear power.
It is frustrating to see climate policy move so slowly, in part because the potential of nuclear power remains largely untapped. Our failure to act swiftly has locked in decades of additional emissions and narrowed the options available to us. Future generations may well look back on this period as a profound missed opportunity — one of the great failures of our time. Still, halting global warming will not happen any faster if we keep insisting that nuclear power comes “too late”.
Of course, anything chained to endless regulations can never be a quick fix. Nuclear power may now indeed come too late to avert some of the worst effects of global warming. But the deeper truth is this: it simply came too early.


