Fusion: Book review of “Sun in a Bottle”

Preface. I don’t know of a book or article that better explains fusion and why fusion is so difficult and so far from being commercial, or may never be commercial for that matter, except in hydrogen bombs.

Seife is a science writer, and writes well and clearly, the opposite of the National Research Council books I slogged through on fusion, which were so technical you had to be a nuclear engineer to understand them. And the book tells a story, with Edward Teller as its villain, who was even worse than I remembered.

If you know anyone who still believes in cold fusion, make them get this book. I have barely touched on the complex fraud and the messiness of how it was handled, the politics, the personalities of the scientists involved, and more.

And obviously read the book if this review made you interested in how fusion works; I was only able to include a small part of the difficulties. It is also a great way to understand how the scientific method works and how scientists come up with new ideas. But unfortunately there will be no endless fusion energy to rescue us from fossil fuel decline.

Since this book was published in 2008, there is a great deal more to be said about ITER, the Lawrence Livermore laser fusion experiments, tritium shortages, and so on. Check out my other fusion posts at energyskeptic here:

https://energyskeptic.com/category/energy/fusion/

And if you want to do a deep dive, here are free government pdfs on Fusion:

National Academies of Sciences, Engineering, and Medicine. 2021. Plasma
Science: Enabling Technology, Sustainability, Security, and Exploration.
Washington, DC: The National Academies Press. https://doi.org/10.17226/25802 432 pages

National Academies of Sciences, Engineering, and Medicine. 2021. Bringing
Fusion to the U.S. Grid. Washington, DC: The National Academies Press.
https://doi.org/10.17226/25991 125 pages

National Academies of Sciences, Engineering, and Medicine. 2019. Final Report
of the Committee on a Strategic Plan for U.S. Burning Plasma Research.
Washington, DC: The National Academies Press. https://doi.org/10.17226/25331 341 pages

Alice Friedemann  www.energyskeptic.com  Author of “Life After Fossil Fuels: A Reality Check on Alternative Energy”, “When Trucks Stop Running: Energy and the Future of Transportation”, “Barriers to Making Algal Biofuels”, & “Crunch! Whole Grain Artisan Chips and Crackers”.  Women in ecology.  Podcasts: WGBH, Jore, Planet: Critical, Crazy Town, Collapse Chronicles, Derrick Jensen, Practical Prepping, Kunstler 253 & 278, Peak Prosperity.  Index of best energyskeptic posts

***

Seife C (2008) Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking. Viking.

Teller realized a hydrogen bomb would be 1000 times more powerful than the atomic bomb. This new weapon, the “Super,” would unleash a power not yet seen on Earth: fusion. Instead of breaking atoms apart to release energy (fission), the superbomb would stick them together (fusion) and release even more energy. While this might seem to be a subtle difference, fusion, unlike fission, had the potential to produce weapons of truly unlimited power. A single Super would be able to wipe out even the largest city, a task far beyond even the bombs of Hiroshima and Nagasaki. A fusion bomb would be the ultimate weapon.

If you put a gram of radium in a sealed ampule, over many, many years the radium will gradually disappear by spontaneously splitting apart and vanishing from view. But its atoms don’t disappear entirely. When an atom of radium breaks apart, it tends to split into two smaller pieces. The heavier of the two is a gas known as radon; the lighter is helium, and the Curies detected them both emanating from their radium sample.

Here was the answer to why radium seemed to have an excess of energy. The whole atom weighed more than the sum of its parts. When the radium atom spontaneously broke apart, some of its mass changed into energy, just as Einstein’s equation allows. The m had become E. The missing mass was only a tiny fraction of what made up the atom, but even tiny chunks of mass are converted into enormous amounts of energy.
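
The arithmetic behind “tiny chunks of mass, enormous energy” is easy to check with E = mc². A minimal back-of-the-envelope sketch (the TNT conversion factor is my own assumption, not from the book):

```python
# E = m * c^2: even a tiny amount of mass converts to an enormous energy.
C = 2.998e8          # speed of light, m/s
TNT_TON_J = 4.184e9  # joules released by one ton of TNT (assumed conversion)

def mass_to_energy_joules(mass_kg):
    """Energy equivalent of a mass, via E = m * c^2."""
    return mass_kg * C ** 2

# Converting just one gram of mass entirely to energy:
e = mass_to_energy_joules(1e-3)
print(f"{e:.2e} J, or about {e / TNT_TON_J:,.0f} tons of TNT")
```

One gram of vanished mass works out to roughly 9 × 10¹³ joules, on the order of twenty thousand tons of TNT, which is why a tiny mass deficit per atom adds up to a city-leveling bomb.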

Less than a month before Germany invaded Poland in 1939, Einstein warned President Franklin Delano Roosevelt of the possibility of a bomb made from uranium, a metal that, like radium, releases energy when it breaks into pieces.

FISSION

Uranium is an ideal material for a weapon. Its atoms are very sensitive; hit one with a subatomic particle and it fissions into fragments. Unlike decaying radium, which tends to cleave cleanly into two parts, a fissioning uranium atom shivers into a number of smaller chunks. If enough uranium atoms are in a small enough space, the process snowballs out of control in less than the blink of an eye. One atom fissions, and its neutrons cause two more to split. These cause four more to fission, causing eight to break apart, then 16, 32, 64, and so forth. After 10 rounds, over 2,000 atoms have split, releasing neutrons and energy. After 20 rounds, it’s more than two million atoms; after 30 rounds, two billion; after 40, over a trillion. This is a chain reaction. A chain reaction, if it gets big enough, can level a city. After 80 rounds, a mere fraction of a second after the chain reaction begins, the result is more energetic than the explosion of 10,000 tons of TNT, roughly the size of the blast that destroyed Hiroshima.
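
The doubling arithmetic above can be reproduced in a few lines. This is only an illustrative sketch; the ~200 MeV released per fission is a standard textbook figure I am assuming, not something stated in the book:

```python
MEV_TO_J = 1.602e-13       # joules per MeV
E_PER_FISSION_MEV = 200.0  # typical energy per U-235 fission (assumed)
TNT_TON_J = 4.184e9        # joules per ton of TNT

def atoms_split(rounds):
    """Total atoms split after `rounds` doublings, starting from one fission."""
    return 2 ** (rounds + 1) - 1

for rounds in (10, 20, 30, 40, 80):
    atoms = atoms_split(rounds)
    tons = atoms * E_PER_FISSION_MEV * MEV_TO_J / TNT_TON_J
    print(f"after {rounds:2d} rounds: {atoms:.2e} atoms, ~{tons:.1e} tons of TNT")
```

After 10 rounds this gives 2,047 atoms split (“over 2,000”), and after 80 rounds the energy lands in the ten-kiloton range, consistent with the Hiroshima-scale figure in the text.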

It took two years of cogitation and experimentation for the consensus to build: it was possible to build a powerful bomb out of uranium-235 or plutonium-239.

The core of a nuclear reactor is little more than a controlled chain reaction: a pile of fissioning material that is not quite at the stage of entering a runaway explosion. Scientists arrange the pile so that the number of neutrons produced by splitting atoms is almost precisely the right amount to keep the reaction going without getting faster and faster; each generation of fission has roughly the same number of atoms fissioning as the last. In physics terms, the pile is kept right near critical condition.

Scientists can manipulate the rate of the reaction by inserting or removing materials that absorb, reflect, or slow neutrons. Pull out a rod of neutron-absorbing material and more neutrons are available to split atoms and release more neutrons: the pile goes critical. Drop the rod back in and more neutrons are absorbed than released: the reaction sputters to a halt.

FUSION

In fusion, light atoms stick together, and the whole resulting atom is lighter than the sum of the parts that made it. The missing matter, the stuff that disappears when the light atoms combine, becomes energy. Fusion is several times more powerful than fission; more of the mass of each reacting atom is converted into energy. Better yet, it is much easier to find the fuel for fusion (light atoms like hydrogen) than it is to find the uranium or plutonium fuel for fission.

The fusion reaction is extremely difficult to start, and even harder to keep going long enough to produce large quantities of energy. Atoms tend to repel each other, so it is very hard to get them close enough so that they stick together. You need an enormous amount of energy to slam two atoms together forcefully enough to overcome that repulsion and get them to fuse.

For a fission reaction, you just need to get a lump of uranium big enough. For fusion, you need to manipulate your fuel in some tricky ways. First, you’ve got to compress the fuel into a tiny parcel. This keeps the atoms in close proximity to one another (so they have a chance of colliding). That, in itself, is not so hard; the trick is to keep the atoms very hot as well. Only at tens or hundreds of millions of degrees are the atoms moving fast enough to have a chance of fusing when they do collide. When you heat something, it expands; the atoms try to escape in all directions. Thus, it is very hard to keep a very hot thing compressed very tightly. So, the basic problem in fusion is that it is very difficult to heat something to the right temperature and, simultaneously, keep the atoms close enough together. Without both things working concurrently, a fusion reaction won’t get going.

If you are lucky enough to start a fusion reaction, your own success works against you. When the fusing atoms release energy, they pour heat into their surroundings. This makes the neighboring atoms hotter. The hotter the atoms get, the more the fuel expands and the harder the atoms try to escape. The packet of fuel attempts to blow itself apart. Unless the conditions are just right, a fusion reaction will snuff itself out before it produces any appreciable energy.

If scientists could get a fusion reaction going even for a few fractions of a second, its power would be virtually limitless.

History of fusion

In 1950, British police arrested Los Alamos physicist Klaus Fuchs, who confessed to being a spy. All of a sudden it became clear why the Russians were able to build an atom bomb so quickly. Worse yet, Fuchs had been involved in discussions about the fusion bomb; in fact, he was coholder of a key secret patent having to do with the method used to ignite the first working hydrogen bombs. The Russians knew all about the fusion bomb, and they had likely already begun research. Truman felt he had little choice. Four days later, the president of the United States issued a public statement to his citizens. “It is part of my responsibility as Commander in Chief of the Armed Forces to see to it that our country is able to defend itself against any possible aggressor,” it read. “Accordingly, I have directed the Atomic Energy Commission to continue its work on all forms of atomic weapons, including the so-called hydrogen or superbomb.”

Truman’s hand had been forced, but he had just made a dangerous decision. He had committed the United States to an arms race with the Soviet Union that would make both countries insecure and lead the world to the brink of destruction, all for the sake of a fusion weapon that, at the time, was merely a figment of Teller’s fertile imagination.

Teller’s deuterium bomb ran into problems from the start. Even deuterium, which is much easier to fuse than ordinary hydrogen, would be hard to ignite. In 1942, mere months after Teller’s initial visions of the Super, scientists realized that a “hydrogen bomb” should have tritium, a still heavier version of hydrogen, mixed in with the deuterium if it was to have any chance of exploding. The problem is that tritium doesn’t occur much in nature; it has to be manufactured if you want a large quantity of it. And this process required the same resources-and was about as expensive-as manufacturing plutonium.

According to Teller’s estimates, it would require a few hundred grams of tritium, a significant but manageable amount of the rare substance, to get the Super working. Those estimates were wildly optimistic. The Ulam-Everett calculations implied that a Super would need ten times that: a few kilograms. There was no way the United States could manufacture that amount of tritium in a reasonable period of time. Teller’s device was impractical.

Teller realized that by adding a tiny bit of tritium to the center of an exploding fission warhead, the tritium would fuse and “boost” the yield of the atom bomb. This was a practical idea, and it ultimately did work, but it was just a way of making a slightly better fission weapon. It was far from the thermonuclear fusion superbomb that Teller had promised. World politics made the situation dire. On June 25, North Korean soldiers marched across the 38th parallel into South Korea. Seoul fell within days. And within two weeks, General Douglas MacArthur was figuring out how best to use nuclear bombs in the conflict. The battle went back and forth. Then, in November, soon after China entered the war, Truman threatened the use of atomic weapons. MacArthur asked for the discretion to use them on the battlefield. The world seemed on the brink of nuclear war. The fusion bomb, the weapon that was supposed to restore America’s military advantage, was nowhere to be found.

Ironically, it was Ulam, the man who brought Teller to tears, who would lift him out of his despair by seeing a way to build a working fusion weapon. In 1951, he realized that he could use the stream of particles coming off an atom bomb to compress the hydrogen fuel, making it hot and dense enough to ignite in a fusion reaction. Instead of a simple bomb with a tank of deuterium, the new hydrogen bomb would have an atom bomb primary separated from a deuterium-tritium secondary. Particles from the atom bomb (radiation that would ordinarily stream away from the explosion) could be focused onto the secondary to compress, heat, and ignite it. It would be tricky to engineer such a device, but it seemed to overcome the problems that dogged the classical Super. “Edward is full of enthusiasm about these possibilities,” Ulam reported to von Neumann. “This is perhaps an indication they will not work.” Nevertheless, the enthusiasm was justified. It would mark the end of the dark times for the fusion hawks, and for Teller. By May, Los Alamos would have experimental data to back up the theoretical calculations.

At 7:14:59 AM on November 1, 1952, roughly a half second ahead of schedule, the island of Elugelab suddenly disappeared. A compact 80-ton device, nicknamed “the sausage,” unleashed the power of the sun upon the Earth for a few moments. The fusion reaction from this device, the first hydrogen bomb, vaporized Elugelab. All that remained was a cloud of dust and fire that stretched 20 miles into the stratosphere. The energy it produced was an astonishing 10 megatons, 50 times bigger than the Greenhouse George shot and about the size of seven hundred Hiroshima bombs.

It had taken too long. The Russians were already hot on the fusion trail. Shortly after World War II, all across the Soviet Union, mysterious secret cities began sprouting up. Among them: Arzamas-16, near Novgorod; Semipalatinsk-21 in Kazakhstan; Chelyabinsk-70 in the Ural Mountains. After decades of speculation and spying, we now know that these were the cities devoted to designing, testing, and building nuclear weapons.

In April 1954 Senator Joseph McCarthy accused Oppenheimer of deliberately delaying the H-bomb by 18 months. The AEC began formal hearings to strip Oppenheimer of his security clearance. The charges against him: various associations with Communists, lying to the FBI about Communist meetings, and strong opposition to the development of the hydrogen bomb in 1949. Oppenheimer was being punished, in part, for not jumping on the fusion bandwagon. The Communist associations would probably have been enough to sink Oppenheimer. Nonetheless, Teller and his allies hammered the hapless physicist for dragging his feet about the Super project. Mercilessly. Ernest Lawrence, Luis Alvarez, and Kenneth Pitzer expressed their doubts, in testimony or in affidavits, about Oppenheimer’s resistance to building a fusion superbomb.

Teller testified, too, and he seemed to relish twisting the knife. “It is my belief that if at the end of the war some people like Dr. Oppenheimer would have lent moral support, not even their own work-just moral support-to work on the thermonuclear gadget . . . I think we would have had the bomb in 1947.” When asked what it would mean to atomic science if Oppenheimer was to “go fishing for the rest of his life,” Teller said that Oppenheimer’s post-Los Alamos work was simply not helpful to the United States. Scientists sympathetic to Oppenheimer would never forgive Teller for his testimony. Teller likened the reception he got from his fellow physicists to his exile from Europe. He wrote: “In my new land, everything had been unfamiliar except for the community of theoretical physicists. . . . Now, at 47, I was again forced into exile.”

Eisenhower became president in 1953, and like Truman, he threatened to use nuclear weapons against China. In May 1953, American diplomats made veiled but clear nuclear threats that seem to have helped end the Korean War. Even after that conflict was essentially settled, the nuclear saber rattling against China continued. As the United States was drawn into the China-Taiwan standoff, Eisenhower contemplated the use of nuclear weapons. He considered them similar to any other munition, and in March 1955, at the direction of the president, Secretary of State John Foster Dulles announced that nuclear bombs were “interchangeable with the conventional weapons” used by U.S. forces.

Ordinary hydrogen’s nucleus is simply one proton. It weighs as much as one proton, so it is known as hydrogen-1, or ¹H. Deuterium’s nucleus, too, has one proton. But it also has a neutron that weighs roughly the same as the proton; the mass of the nucleus (hence, the mass of the atom) is doubled. Deuterium is thus known as hydrogen-2, ²H. Tritium has a single proton in its nucleus, but in addition it has two neutrons, making it three times as heavy as ordinary hydrogen. Tritium is therefore designated hydrogen-3, ³H. All these atoms are considered to be varieties, or isotopes, of hydrogen. In a chemical reaction, all three behave more or less the same way. But they have slightly different physical properties by virtue of their nuclei’s different weights.

Scientists were thrilled when they discovered the neutron because it gave them a complete model to explain an atom’s chemical behavior. Just figure out how many protons and neutrons are in a given atom and you can predict its properties extremely well. Scientists were astonished that atoms could exist at all. Nuclei are finicky things, and it is amazing that any of them are stable. By rights, they should fly apart instantly. They are filled with positively charged protons, and positively charged things repel one another. If the protons in a nucleus were to obey their electrical urges, they would flee each other’s presence, and the nucleus would explode in all different directions. But something forces the protons to stay put and in close proximity to one another. A very strong force, stronger than gravity, stronger than electromagnetism, glues nuclei together, trapping protons inside. In a great burst of creativity, scientists dubbed this the strong force. It holds the secret to nuclear fusion.

If there are too many neutrons or too few, the nucleus will be unstable. Hydrogen (one proton) and deuterium (one proton and one neutron) are stable. Left to their own devices, they would not change at all. But add a second neutron to the mix, making tritium, and the atom has too many neutrons for comfort. It is no longer stable. Eventually, a tritium atom will, spontaneously, transmute one of its neutrons into a proton (and spit out an electron in the process). The substance left behind is no longer tritium; it has become helium-3, a stable if rare isotope of helium that has two protons and one neutron. It takes an average of 12 years or so for any given tritium atom to undergo this decay process, but over time, if you have a jar full of hydrogen-3, you will find that it slowly transforms itself into helium-3.
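
That slow transformation is ordinary exponential decay. A small sketch, assuming tritium’s commonly cited half-life of about 12.3 years (consistent with the book’s “12 years or so”):

```python
TRITIUM_HALF_LIFE_Y = 12.3  # years; commonly cited value (assumed)

def fraction_remaining(years):
    """Fraction of an initial tritium sample still undecayed after `years`."""
    return 0.5 ** (years / TRITIUM_HALF_LIFE_Y)

for y in (12.3, 24.6, 50.0):
    print(f"after {y:4.1f} years: {fraction_remaining(y):.1%} is still tritium")
```

After one half-life, half the jar has become helium-3; after 50 years, only about 6% of the original tritium remains.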

This is an example of a general rule. When a nucleus converts itself from a less-stable variety to a more-stable one, it releases a little bit of energy because some of its mass disappears. And nuclei always “want” to become more stable, just as a ball perched on a hill “wants” to roll down to the bottom. In the process of getting more stable, an atom releases energy, just as a ball rolling down a hill picks up more and more speed as it goes.

Iron-56 (26 protons, 30 neutrons), nickel-62 (28 protons, 34 neutrons), and a few other nearby iron and nickel isotopes are the most stable nuclei. They are at the very bottom of the valley. All other atoms “want” to be iron, just as a ball anywhere on the slope of a hill “wants” to be at the very bottom.

Fusion is a way for light atoms to roll down the steep hill toward iron. Since the fusion hill is much steeper than the fission one, a fusion reaction yields much more energy than an equivalent fission reaction.

In the 1930s, fusion would soon solve the puzzle that had so vexed Darwin and Kelvin, and it would answer a question that had bothered humans for millennia: Why does the sun shine? If the sun were an incandescent ball of liquid, the energy released by infalling matter would only power it for a few tens of millions of years, far short of the time that Darwin’s theory needed to explain the diversity of life on Earth (and far short of the time that other scientists needed to explain geological processes).

Roughly 90% of the sun’s atoms are hydrogen. About 9% are helium. It was by looking at the sun that scientists discovered helium in the first place. The remaining 1% is mostly carbon, nitrogen, oxygen, neon, and a tiny smattering of heavier elements, but almost all of these are lighter than iron. The sun bears all the hallmarks of being powered by fusion. As the gas cloud gets denser, atoms of hydrogen bump into each other more and more frequently. The collision rate increases dramatically. And as the cloud heats up, its atoms have more energy and collide more violently. The hydrogen atoms jostle each other harder and harder.

Ordinarily, nuclei try to escape from one another. They are positively charged, so they find other nuclei repulsive. When two atoms “collide,” they don’t usually come into physical contact. Once they get within close range, the repulsive forces send them zooming in opposite directions before they actually touch, something like what happens when you try to make two powerful magnets touch each other despite their mutual repulsion. But if the nuclei are moving fast enough-if both atoms are hot enough-then even the mutual repulsion is not enough to keep the nuclei from hitting each other. The two nuclei slam together with great force. This is where fusion begins, and how a sun sparks to life.

Once a cloud of hydrogen gets hot enough and dense enough, it turns into a machine that converts hydrogen to helium, releasing energy. That is how the nuclear furnace at the heart of a star works. Colliding protons release a positron and make deuterium, then deuterium and protons make helium-3, and finally two helium-3 nuclei make helium-4, producing a lot of energy in the process.
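
The net bookkeeping of that chain, four hydrogen nuclei in and one helium-4 out, can be checked from the particle masses. A rough sketch; the masses are standard values I am supplying, and the positron/neutrino accounting is ignored, so the result is approximate:

```python
C = 2.998e8             # speed of light, m/s
M_PROTON = 1.6726e-27   # kg (assumed standard value)
M_HELIUM4 = 6.6447e-27  # kg, helium-4 nucleus (assumed standard value)
MEV_TO_J = 1.602e-13

deficit = 4 * M_PROTON - M_HELIUM4        # the mass that "disappears"
fraction = deficit / (4 * M_PROTON)       # roughly 0.7% of the fuel's mass
energy_mev = deficit * C ** 2 / MEV_TO_J  # roughly 26 MeV per helium-4

print(f"mass lost: {fraction:.2%} of the hydrogen fuel")
print(f"energy released per helium-4 formed: ~{energy_mev:.0f} MeV")
```

About 0.7% of the fuel’s mass vanishes and becomes energy, which is why a star can shine for billions of years on hydrogen alone.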

The fusion energy released in the guts of the sun makes it shine. But it also threatens to blow the sun apart. The energy heats the hydrogen gas, making the nuclei slam together harder and harder, and the reaction speeds up, pouring more energy into the cloud. This would appear to lead to a runaway reaction; the furnace should run hotter and hotter and eventually get so energetic that the cloud explodes violently in all directions. However, it turns out that the hotter a cloud of gas is, the more it expands. So when the fusion engine runs hot, the star expands slightly. It becomes slightly less dense and the atoms slam into each other less and less often. The fusion engine slows, and the star cools. Gravity takes over once more, compressing the star, heating it up, and making the fusion energy run hot again. This means that a star is in a delicate equilibrium, caught between the force of gravity and the energy of fusion. The force of gravity tries to collapse the star while the energy of fusion tries to blow it apart.

When that delicate equilibrium fails, the star dies. A fusion engine, no matter how well-balanced, can only run for as long as it has fuel. As a star gets older, its hydrogen supply begins to run out; the hydrogen fusion cycles sputter to a halt. A large star then turns to other light elements to keep itself from collapsing. It begins to fuse helium, turning it into yet heavier elements, such as carbon and oxygen. As the helium runs out, the star fuses heavier and heavier fuels: carbon, oxygen, silicon, sulfur. The fusion engine is rolling further and further down the fusion hill. Soon it hits bottom. The valley of iron.

Fusion gets its energy by making light elements roll down the hill toward iron. Fission gets its energy by making heavy elements roll down the hill toward iron. Iron, already at the bottom of the hill, can’t yield energy through fusion or fission. It is the dead ashes of a fusion furnace, utterly unable to yield more energy. When a star runs out of other fuels, its iron cannot burn in its fusion furnace. The fusion engine has nothing that it can turn into energy, so it shuts off and the star abruptly collapses. Depending on the star’s nature, it can die a fiery death: the final collapse ignites one last, violent burn of its remaining fuel, blowing up the star with unimaginable violence. A supernova, as such an explosion is called, is so energetic that a single one will typically outshine all the other stars in its galaxy combined. The star spews its guts into space, contaminating nearby hydrogen clouds in the process of collapsing into new stars. This is what happened before our sun was born; it got seeded with the nuclear ash of a supernova explosion. All the iron on Earth, all the oxygen, all the carbon, almost all the elements heavier than hydrogen and helium, are the remnants of a dead fusion furnace. We are all truly made of star stuff.

When Hans Bethe solved the riddle of the sun’s energy, the idea of a fusion bomb seemed absurd. To ignite a fusion reaction, you need to have a bunch of light atoms that are extremely hot (so they have enough energy to overcome their mutual repulsion and slam into each other) and extremely dense (so they are close enough to one another that they collide frequently). The laws of nature seem to conspire against having those two conditions at the same time: hot things expand, reducing their density. The only reason a star can keep a stable fusion engine going is because it is so massive. It is held together by the intense force of its own gravity, resisting the explosive force of the fusion engine in its belly. A cloud of hydrogen smaller than a star doesn’t have the benefit of that gravitational girdle keeping the reaction from puffing out. Even if you are somehow able to start a fusion reaction, it will blow itself up and snuff itself out in moments.

It is extraordinarily hard to get fusion going outside a star.

Even the biggest explosion of all-the big bang-couldn’t get fusion going for more than a few minutes. In the first seconds after the big bang, all the matter in the universe-lots of fundamental particles, including a whole bunch of protons-was contained in a relatively small, intensely hot space. Protons-hydrogen nuclei-fused to create helium. But the universe was expanding rapidly because of its own explosive energy. After about three minutes, the universe had expanded so much that the matter wasn’t dense enough to fuse anymore. As hot and dense as the early universe was, it could only sustain fusion for a measly three minutes. After that, there wasn’t any fusion going on in the universe, not until it occurred again in the heart of a solar furnace. To get fusion going on Earth, you must create conditions almost as extreme as those of the first few minutes after the big bang. And without the massive gravity of a star, it is nearly impossible to keep those conditions going for very long.

This was the major obstacle to designing the hydrogen bomb. Even with relatively easy-to-fuse deuterium and tritium (or lithium) as fuel, it is hard to make the fuel hot enough and dense enough to get the nuclei fusing. And if you can initiate fusion, you have to maintain the high temperatures and high density long enough to generate an appreciable amount of energy.

As a substance heats up, it radiates energy more rapidly. In fact, the radiation goes up as the fourth power of the temperature: double the temperature of an object and it radiates its energy sixteen times as fast. To ignite fusion, the fuel has to get to tens or hundreds of millions of degrees (depending on the density of the fuel). Yet even then, it will radiate its energy away at a tremendous rate; it is almost as if everything in the universe is trying to cool it down. And even if Teller had been lucky enough to ignite fusion, he would have been unable to keep the reaction from blowing itself apart with its own energy just as it got going.
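
The fourth-power scaling mentioned above (the Stefan-Boltzmann law) is easy to see numerically:

```python
def radiation_ratio(t_new, t_old):
    """How much faster a body radiates after heating from t_old to t_new (T^4 law)."""
    return (t_new / t_old) ** 4

print(radiation_ratio(2, 1))   # doubling the temperature: 16x faster
print(radiation_ratio(10, 1))  # ten times hotter: 10,000x faster
```

This is why heating fuel to fusion temperatures is so punishing: every gain in temperature multiplies the rate at which the fuel bleeds its energy away.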

DETONATION OF A HYDROGEN BOMB: (a) A fission bomb explodes at one end of the device, sending x-rays in all directions. (b) The x-rays strike the walls of the container, causing them to evaporate and radiate more x-rays. These x-rays hit the capsule containing deuterium and tritium, causing it to implode. (c) The compressing deuterium and tritium fuel heats up and ignites a fission “spark plug” at the center of the device, causing it to explode outward. Trapped between the imploding container and the exploding spark plug, the fuel ignites in a fusion reaction. The reaction lasts only a fraction of a second before it blows itself apart, but in the process it releases an enormous amount of energy.

Ivy Mike was equivalent to ten megatons of TNT, but there was no reason why the whole device could not have been scaled up by adding a third stage . . . or a fourth. In 1961, the Russians detonated a (roughly) 50-megaton whopper nicknamed the Tsar Bomba, the most powerful weapon ever built by man.

As early as 1949, scientists realized that after about 150 megatons, hydrogen bombs simply take a huge column of air and lift it into outer space, punching a hole in the atmosphere about 14 miles across. Bigger bombs would not do much more than that. They would radiate most of their energy uselessly into space. So after 150 megatons, there was no point in getting bigger, unless you wanted to build a fusion device large enough to destroy the Earth. Not even the most rabid hawks were in favor of that. Nevertheless, with Ivy Mike and its successors, the fusion bomb scientists had succeeded at creating a tiny star on Earth. For a fraction of a second, scientists were able to get a fusion reaction going. They had figured out how to use that energy for war. It would be much, much harder to harness that energy for peace.

On January 15, 1965, deep inside the Soviet Union, a nuclear rumble shook the earth. The Americans were the first to spot the radioactive cloud as it floated over Japan and toward the Pacific Ocean beyond. The Soviets had detonated a fusion bomb, and the fallout was contaminating the atmosphere over the Japanese mainland. The explosion, and the resulting fallout, was the beginning of a top-secret Soviet project, Program No. 7. Starting with the mysterious January 1965 explosion and continuing for the next twenty-three years, Program No. 7 would use fusion weapons to dig canals, build underground storage caves, turn on and shut off gas wells, and change the face of the Earth. Russian scientists, in a fraction of a second, had carved a major lake and a reservoir, now known as Lake Chagan, out of bedrock.

Program No. 7 was not the only secret government project to harness the power of fusion. An equivalent program was already under way in the United States. A few years earlier, American scientists began work on Project Plowshare and started drawing up plans to use nuclear weapons to create an artificial harbor in Alaska, widen the Panama Canal, and dig a second Suez canal through Israel’s Negev desert.

We could visit the outer reaches of the solar system and even visit nearby stars. We would never have to worry again about dwindling energy supplies, oil crises, or global warming. Of course, it wouldn’t work out quite that nicely. Films touted the brilliant future revealed by nuclear power. In the 1952 short A is for Atom, a giant glowing golem, arms crossed, represented “the answer to a dream as old as man himself, a giant of limitless power at man’s command.”

Lewis Strauss, the AEC chairman and Teller backer, promised the world a future where the energy of the atom would power cities, cure diseases, and grow foods. Nuclear power would reshape the planet. God willed it. The Almighty had decided that humans should unlock the power of the atom, and He would keep us from self-annihilation. “A Higher Intelligence decided that man was ready to receive it,” Strauss wrote in 1955. “My faith tells me that the Creator did not intend man to evolve through the ages to this stage of civilization only now to devise something that would destroy life on this earth.”

The potential of fission seemed microscopic compared to the unlimited power of fusion, and this is what excited Edward Teller so much. Fusion couldn’t just generate energy; it could move mountains. Literally. Teller was going to make it happen. “If your mountain is not in the right place,” he once said at a press conference, “drop us a card.”

Teller and his Livermore colleagues immediately seized upon Suez as an opportunity. They announced that fusion could solve the Egyptian problem once and for all: a chain of hydrogen bombs exploding across Israel’s Negev desert would cut a second channel from the Mediterranean to the Red Sea. Egypt would no longer have a monopoly. Or we could blow up rapids to make rivers navigable. Cut trenches to help irrigate crops. Straighten the route of the Santa Fe Railroad. Mine coal and rare minerals. Free oil and gas reserves. “We will change the earth’s surface to suit us,” Teller wrote. Teller also suggested using hydrogen bombs to change the weather, to melt ice to yield fresh water, and to mass-produce diamonds; he even found the idea of bombing the moon enticing. “One will probably not resist for long the temptation to shoot at the moon . . . to observe what kind of disturbance it might cause,” he wrote. President Kennedy ordered the AEC to tackle the problem of building a second Panama Canal with fusion devices.

In Russia similar experiments were tried, and Program No. 7 had a little more success than Project Plowshare. After the creation of Lake Chagan, the Soviets briefly experimented with nuclear excavation of lakes and dams, but the results were disappointing. The Russian efforts to turn on gas and oil wells with bombs were more successful than the American tests; production often increased dramatically. But reports indicate that at least one oil field is contaminated with radioactivity, and its oil is “not acceptable to regional refineries.” In 1966, the Soviets used a nuclear bomb to shut off a gas well and snuff a runaway fire. They also used nuclear explosives to make underground caverns for storing toxic waste, to break up mineral ores, and to create seismic shockwaves to aid in the exploration for natural resources. All in all, Program No. 7 consisted of 122 nuclear explosions between 1965 and 1988. Their results were mixed at best. Hydrogen bombs, it turned out, did not give humanity the power to move mountains or to reshape the landscape to suit its fancy.

An exploding nuclear bomb is a veritable treasure trove of radioactive debris: the unfissioned uranium and plutonium from a bomb’s primary as well as lighter radioactive atoms left behind by the uranium and plutonium that did fission. A great burst of neutrons also accompanies a large blast; these neutrons strike surrounding atoms (in the atmosphere, in the dirt, in people) with great force. Occasionally these neutrons stick, changing once-stable atoms into radioactive ones. Neutrons can turn ordinary material into a radioactive mess, a phenomenon known as neutron activation. Neutron-activated material, catapulted high into the air, falls to earth downwind of a nuclear explosion, irradiating anyone unfortunate enough to come into contact with this fallout. (Radiation strips electrons from DNA and alters its structure, killing cells and causing cancers.) If a nuclear explosion is powerful enough, it sends radioactive debris so high into the atmosphere that fallout can descend halfway around the globe.

As radioactive as the Nagasaki and Hiroshima bombs were, the multimegaton blasts of fusion weapons were much worse.

At 6:45 AM on March 1, 1954, the United States detonated a hydrogen bomb in the test code-named Castle Bravo; ground zero was a reef in Bikini Atoll. The explosion was much bigger than expected: 15 megatons, the largest explosion yet, roughly equivalent to 1,000 Hiroshima-sized bombs. The fireball pulverized the coral reef, sending pieces flying thousands of feet into the air. By 8:00 AM, “pinhead-sized white and gritty snow” began to shower the American fleet observing the test. This was highly radioactive fallout. The radiation levels on the ships rose rapidly, and the fleet immediately steamed south to escape.

The Castle Bravo accident marked a turning point in the perception of nuclear tests. Every time such a test weapon exploded, it spewed radioactive ash into the atmosphere, and scientists noticed that the world was becoming increasingly radioactive as a result. As tests continued, the problem got worse. Scientists were particularly concerned about a radioactive isotope of the metal strontium: strontium-90. Produced by fission in an atomic or hydrogen bomb, strontium-90 is metabolized in a way similar to calcium. It is readily taken up by the body, especially a child’s body, and is deposited in bones, teeth, and mother’s milk. Once it is inside the body, it destroys from within. And observers were detecting more and more strontium-90 worldwide.

By the mid-1950s, scientists such as Albert Schweitzer and Linus Pauling were raising the alarm. “Each nuclear bomb test spreads an added burden of radioactive elements over every part of the world,” read a Pauling-drafted petition from 1957. “Each added amount of radiation causes damage to the health of human beings all over the world and causes damage to the pool of human germ plasm such as to lead to an increase in the number of seriously defective children that will be born in future generations.” All the data showed that concentrations of strontium-90 were doubling every two years.

Teller tried consistently to squelch the growing fears about fallout. The radiation from atomic testing is “very small,” he argued. “Radiation from test fallout might be slightly harmful. It might be slightly beneficial.” He ridiculed the public’s concerns. Teller even suggested that the Daigo Fukuryu Maru crewman who died might have succumbed to hepatitis rather than radiation exposure. In his view, the “fallout fear-mongers” were damaging the security of the United States because they were threatening to end his nuclear schemes.

On Moscow Radio, Andrei Gromyko, the foreign minister, announced the “cessation of tests of all forms of atomic and hydrogen weapons in the Soviet Union.” The world wanted a solution to the growing fallout problem and a stop to the nuclear arms race, and the Soviet Union, unlike the United States, had responded. To Teller, stopping nuclear testing was tantamount to surrendering America’s nuclear advantage to the Russians, and he would do almost anything to stop it from happening. The mask had come off. Teller’s opposition to the test ban had little to do with a vision of a fusion utopia. His future was not a future of peace, but of war. He had tried to stop the test ban because he wanted the United States to be prepared for tactical nuclear war with the Soviet Union. He had used the promise of peaceful nuclear explosions as a tool to ensure continued military research, and to make sure that weaponeers had more bombs to design. Teller’s Plowshare was not a vision from the prophet Isaiah, but one from the prophet Joel: “Beat your plowshares into swords, and your pruninghooks into spears: let the weak say, I am strong.”

The so-called Limited Test Ban Treaty was signed in 1963; this was the agreement that banned anything but underground nuclear explosions. It also forbade any tests that allowed radioactivity to leak beyond national borders. Teller bitterly fought the treaty; he and his allies attempted to undermine it at every turn. Throughout the negotiations, the AEC kept trying to build a loophole into the treaty’s language saying that peaceful nuclear explosions (Plowshare) should be exempt. Every American draft of the treaty had that exemption written into it. The Russians were against the exemption; they countered that “peaceful” nuclear bombs were “superfluous and even dangerous.” After years of negotiation, Kennedy gave up on the exemption, and the accord was signed. However, it was only a matter of months before both sides were violating the brand-new agreement.

Of all the marvelous applications suggested by Teller and his colleagues, only one was actually tested. Three nuclear tests carried out in the late 1960s and early 1970s, code-named Gasbuggy, Rulison, and Rio Blanco, attempted to use nuclear explosions to release natural gas. (The theory was that a big enough explosion would fracture the rocks trapping the gas.) Rio Blanco failed because the bomb didn’t produce caverns of the expected shape. At first, Gasbuggy and Rulison seemed to work. The nuclear bombs shattered rocks around the test site and natural gas poured out of the wells. Unfortunately, the gas was radioactive, and no utility would buy it.

After 12 years of trying and 27 nuclear tests, Project Plowshare sputtered to a halt without ever having proved the usefulness of peaceful nuclear bombs. Fusion bombs were just that: bombs. They were swords too crude to be shaped into plowshares, unable to benefit humanity in any tangible way. Scientists would have to come up with entirely new ideas if they wanted to harness the power of the sun without getting burned.

President Ronald Reagan’s infamous “Star Wars” program, in its first incarnation, would have seeded the heavens with fusion bombs.

The sun itself needs no bottle. It is held together by its own gravity; the mutual attraction of all its atoms is able to keep the fusion engine in its belly from blowing itself apart. But any lump of material smaller than a star does not have enough gravitational force to counteract the enormous pressure of an expanding fusion reaction. For humans to succeed in making an earthbound sun, scientists would have to contain the fusion reaction with an external force: they would have to figure out how to bottle it up.

In the early 1950s, the need was growing urgent. In the past, the United States had always produced more energy than it needed, but that trend was rapidly changing. Economists and scientists knew that by the end of the 1950s, America would have to begin importing fuel (oil) to keep its economy going. The West was getting its first taste of oil addiction, and it wasn’t pleasant. Fusion energy, if scientists could design a bottle to contain it, could prevent a future in which the Western world was held hostage to a dwindling and increasingly expensive supply of foreign oil.

Scientists wanted a reactor that produced energy by fusing hydrogen into helium, and they wanted it to be stable, unlike the dangerous, evanescent explosion of a fusion weapon. To create a workable reactor that would tap the unlimited potential of fusion energy, scientists needed to build a sun in a bottle.

At first glance, it seems impossible to make a bottle sturdy enough to contain a burning sun. What kind of material is strong enough to hold a fusion reaction? To get even the most fusion-friendly atoms to stick to one another, they have to slam together hard enough to overcome their mutual electric repulsion, so the atoms have to be extraordinarily hot: tens or hundreds of millions of degrees Celsius. But matter at such high temperatures is very hard to contain. It is hotter than anything on Earth, far hotter than the melting point of steel. Even a diamond vessel would instantly evaporate at temperatures that extreme. Million-degree substances act almost like universal solvents, eating through whatever container you put them in. Nothing on Earth would be able to contain such hot matter, at least not without some extraordinarily clever tricks.
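As a rough back-of-envelope check on those numbers (a sketch, not from the book, with an assumed nuclear-range approach distance), the electric repulsion between two deuterium nuclei can be estimated in a few lines of Python:

```python
# Sketch: why fusion fuel must be so hot. Two deuterium nuclei must get
# within a few femtometers of each other against their electric repulsion.
# All numbers are illustrative assumptions.
e = 1.602e-19        # elementary charge, C
k = 8.988e9          # Coulomb constant, N*m^2/C^2
kB = 1.381e-23       # Boltzmann constant, J/K
r = 3e-15            # separation where the nuclear force takes over, m (assumed)

barrier = k * e * e / r          # Coulomb barrier, joules
barrier_keV = barrier / e / 1e3  # the same barrier in kilo-electron-volts

# Naive temperature at which average thermal energy (3/2)kT matches the barrier.
T_naive = barrier / (1.5 * kB)
print(f"barrier ~ {barrier_keV:.0f} keV, naive ignition T ~ {T_naive:.1e} K")
```

This naive estimate comes out around a few billion degrees; real plasmas ignite far cooler, at around a hundred million degrees, because quantum tunneling and the fastest particles in the thermal distribution do most of the fusing.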

Pour enough energy into a piece of matter, heat it enough, and atoms lose their individuality. At a high enough temperature, all the electrons are stripped from their nuclei. The electrons are still nearby, but unattached to any particular nucleus; unbound electrons and nuclei roam together in one big blob. The hunk of hot matter becomes an undifferentiated soup of negatively charged electrons and positively charged nuclei. This is a plasma. The positively charged nuclei are still attracted to the negatively charged electrons, but they are not bound together, and this gives a plasma some unusual properties. Unlike most kinds of ordinary matter, most solids, liquids, and gases, the free-floating electrons and nuclei of a plasma are strongly affected by electric and magnetic fields.

Princeton astrophysicist Lyman Spitzer’s bottle would be made of invisible lines of force: it would be made of magnetic fields. By the twentieth century, these fields were extremely well understood. Nonetheless, even simple rules can have seemingly complicated consequences.

In a plasma, you have a large number of charged particles, electrons and nuclei, moving about at relatively high speeds. These moving particles generate magnetic fields. These magnetic fields change the motion of the moving particles. When the motion of the particles changes, so do the magnetic fields they generate, which changes the motion of the particles, changing the magnetic fields, and so on. Add to that the electric attraction that the electrons and nuclei feel for each other and you’ve got an incredibly complex soup.

To Spitzer, the mere fact that the plasma responds to magnetic fields suggested a way to bottle it up. He realized that if you had a plasma moving through a tube and you subjected that tube to a nice, strong magnetic field in the proper orientation, the charged particles in the soup would be forced to move in little circles. They would spiral down the tube in tight little helices, confined by the magnetic field, never even getting close to the walls of the cylinder. The plasma would be confined. In theory, even an extremely hot plasma could be trapped in such a bottle. Furthermore, it was fairly easy to generate the right sort of magnetic field: just wrap a coil of wire around the tube and put a strong current through it; the moving charges in the wire create just the sort of field that is needed.
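The tightness of those little circles can be sketched with assumed numbers: the radius of each circle is m·v/(q·B), so a stronger field pins the particles to tighter helices along the field lines.

```python
import math

# Sketch with assumed field strength and temperature: a charged particle in
# a uniform magnetic field B circles with gyroradius r = m*v/(q*B).
m_p = 1.673e-27   # mass of a proton (hydrogen nucleus), kg
q = 1.602e-19     # its charge, C
kB = 1.381e-23    # Boltzmann constant, J/K
T = 1e8           # 100-million-degree plasma (assumed)
B = 5.0           # field strength, tesla (assumed)

v_th = math.sqrt(2 * kB * T / m_p)   # typical thermal speed
r_gyro = m_p * v_th / (q * B)        # radius of the spiral
print(f"thermal speed ~ {v_th:.2e} m/s, gyroradius ~ {r_gyro*1e3:.1f} mm")
```

Even at a hundred million degrees, a few tesla confine a hydrogen nucleus to circles a few millimeters across, far smaller than the tube.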

The problem with Spitzer’s tube was that it bottles the plasma on the sides, but not at the front or the back of the tube. When the moving plasma reaches the end of the tube, it spills right out. So what to do? Spitzer’s next clever idea was to imagine a tube without end: a donut. Such a donut (or a torus, as physicists and mathematicians like to call it) is just a tube that circles back upon itself. The plasma would move around and around in a circle, never spilling out the end of the tube. With the right magnetic field, it would be the perfect bottle.

Unfortunately, the very curvature that gets rid of the end of the tube makes it very difficult to set up the right kind of magnetic field. The straight tube merely needed a wire curled around it. But when you bend that tube into a donut, the loops of wire on the inside of the donut get bunched up and those on the outside get stretched and spaced out. As current flows down the wire, the magnetic field is stronger where the loops of wire are close together-near the donut hole-and weaker at the edge where the loops are far apart. The nice, even magnetic field in the tube is destroyed; it becomes an uneven mess with a strong side and a weak side. This unevenness is a big problem: it causes the nuclei and electrons in the plasma to drift in opposite directions and into the walls of the container. The bottle quickly loses its contents. It leaks. A straight tube leaks out its ends; a curved tube leaks through its sides.
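The unevenness has a simple form: inside a toroidal coil the field falls off as 1/R with distance R from the donut’s central axis. A sketch with assumed coil numbers:

```python
import math

# Sketch of the toroidal field's 1/R falloff. Turns, current, and radii
# are assumptions for illustration.
mu0 = 4 * math.pi * 1e-7   # permeability of free space
N, I = 1000, 1.0e4         # number of coil turns, current in amperes (assumed)

def B_toroid(R):
    """Field inside an ideal toroidal coil at major radius R (meters)."""
    return mu0 * N * I / (2 * math.pi * R)

B_inner = B_toroid(0.8)    # tube edge nearest the donut hole (assumed radius)
B_outer = B_toroid(1.2)    # tube edge farthest from the hole (assumed radius)
print(f"B near hole = {B_inner:.2f} T, B at outer edge = {B_outer:.2f} T")
```

The field on the inner edge is half again as strong as on the outer edge for these radii, and it is exactly this gradient that drives the nuclei and electrons apart and into the walls.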

Spitzer’s fix was to use two half donuts, connected by tubes that crossed each other: a figure eight. The half-donut sections have the same problem as the full-donut bottle: the electrons and nuclei drift in opposite directions. However, because the tubes cross each other, the plasma winds up going through one half donut clockwise and through the other counterclockwise. This means that the drift on one side should be cancelled by an equal and opposite drift when the plasma goes through the other half donut. It doesn’t quite work that way; the drifts don’t cancel exactly, and the plasma still leaks out a bit, but the leak isn’t as severe as it would be in a torus-shaped bottle.

Shortly after the Livermore laboratory was founded, some of its scientists proposed a slightly different shape for a magnetic bottle. They would stick with a straight tube. Instead of wrestling with the problems caused by curving the plasma’s path, they would try to cap the ends of the tube by tweaking the magnetic fields slightly. Strong magnetic fields at the ends of the tube and slightly weaker magnetic fields at the center would create barriers that would behave almost like a mirror. Some-not all-of the plasma streaming to the end of the tube would be reflected back inside. This magnetic mirror was extremely porous, so it was clearly not a perfect bottle.

A third contender came from across the Atlantic. In the late 1940s, British scientists were also beginning to think about confining plasma, and their method relied on an entirely different phenomenon that they called the “pinch” effect. A pinch starts with a cylinder of plasma. Since the electrons are free to move around inside the cloud, the plasma itself conducts electricity; it’s almost like a piece of copper. You can send an electric current along a plasma cylinder just as you would along a copper wire. And just as in a copper wire, the current running down the plasma creates a magnetic field. But this magnetic field affects the particles in the plasma; it forces them toward the center of the cylinder. The current compresses the cylinder, crushing it toward its central axis. The stronger the current, the greater the effect, and the faster and tighter the plasma gets squashed. As an added benefit, the squashing heats the plasma. This is the pinch effect.
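The scale of the pinch can be sketched with an assumed current and column radius: the field at the column’s surface is μ0·I/(2πr), and the inward magnetic pressure is B²/(2μ0).

```python
import math

# Sketch of the pinch effect (current and radius are assumptions): a current
# I down a plasma column of radius r makes an azimuthal field at its surface,
# and the corresponding magnetic pressure squeezes the column inward.
mu0 = 4 * math.pi * 1e-7   # permeability of free space
I = 1.0e5                  # pinch current, amperes (assumed)
r = 0.01                   # plasma column radius, m (assumed)

B = mu0 * I / (2 * math.pi * r)   # field at the column surface, tesla
pressure = B**2 / (2 * mu0)       # inward magnetic pressure, pascals
print(f"surface field ~ {B:.1f} T, pinch pressure ~ {pressure:.2e} Pa")
```

For these numbers the squeeze is on the order of a million pascals, roughly fifteen atmospheres, from a current quite achievable in the 1950s machines.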

Kruskal and Schwarzschild had discovered that a pinched plasma was like a ball perched on a hill. The slightest disturbance would destroy it. Send a current through a cylinder of plasma and it indeed squashes itself into a dense little filament of hot matter. But the filament is unstable. If it is not perfectly straight, if it has even the tiniest kink, the magnetic fields generated by the pinching current immediately exaggerate and expand the kink. This makes the kink grow, getting more and more pronounced. Any little imperfection in the plasma filament rapidly becomes a huge imperfection. In a tiny fraction of a second, the plasma kinks, bends, and writhes out of control.

As soon as the Perhapsatron started up in 1953, the Princeton team’s calculations were proved correct. The Los Alamos experimenters found that as soon as they got a pinch, forming a nice, tight filament in the center of the Perhapsatron’s chamber, it went poof!

In 1954, Edward Teller figured out that a plasma held in place by magnetic fields was unstable under certain conditions. The magnetic fields behave somewhat like a collection of rubber bands: as the plasma pressure increases, they try to relieve the increasing tension by writhing. “They try to snap inward and let the plasma leak out between them,” Teller wrote. This system was also unstable. Even a tiny irregularity in the magnetic field would rapidly get worse, and scientists would lose control of the plasma. The so-called Teller instability affected the Stellarator as well as Livermore’s magnetic mirror approach. Instabilities were everywhere.

By the mid-1950s, all three groups had enormous difficulties to overcome. Their plasmas were unstable and their bottles were leaky. They spent ever-increasing amounts of money building bigger and more elaborate machines in attempts to get unstable plasmas under control. When, early in 1955, the Los Alamos researchers turned on their newest, biggest pinch machine, Columbus I, they saw a burst of neutrons every time they pinched the plasma hard enough. Pinch. Neutrons. Pinch. Neutrons. No pinch, no neutrons. It seemed like a great success. From the number of neutrons they were seeing, the pinch scientists concluded they had attained fusion; the plasma inside the Columbus machine must have been heated to millions of degrees Celsius. But not everybody was convinced. Researchers at Livermore were skeptical that the pinch machine could reach the temperatures advertised; if they were right, the plasma couldn’t possibly be hot enough to ignite a fusion reaction. So where were the neutrons coming from? To their chagrin, the pinch scientists soon discovered that the neutrons coming out the front of the Columbus machine were more energetic than the ones coming out the rear. In a true thermonuclear reaction, in which nuclei in a hot plasma fuse with one another, the neutrons should stream out in all directions with equal energy.

The asymmetry provided a crucial clue. The scientists pinched the plasma by running a current through it, and neutrons that were flying out of the machine in the direction of the current had more energy than those that flew out against it. This revealed that the neutrons were the work of another instability. Just as a pinched filament is unstable when kinked slightly, because the kink grows and grows, it is unstable when a small section gets pinched a little bit more than the rest of the plasma. In this case, the small pinch grows progressively more pronounced; the plasma gets wasp-waisted and pinches itself off, beginning to look like a pair of sausages. This is a sausage instability, and it creates strong electrical fields near the pinch point. These fields accelerate a small handful of nuclei in the direction of the pinch current. These nuclei then strike the relatively chilly cloud of plasma and fuse, releasing neutrons. Accelerating a few nuclei into cold fuel this way, scientists had already concluded, would always consume more energy than it produced. The neutrons produced by the instability, dubbed instability neutrons or false neutrons, weren’t a sign of energy production; quite the opposite.

The neutrons were the sign of energy consumption, not energy production.

The world’s press ignored the American research and celebrated Britain’s triumphant conquest of fusion energy with its ZETA pinch machine at Harwell. In England, tabloid papers blasted the news across their pages, promising “UNLIMITED POWER from SEA WATER”: no more electricity bills, no more smog, no need for coal, power that would last for a billion years. Newspapers around the globe followed suit; they were quick to trumpet the prospect of limitless energy, energy that would be at humanity’s fingertips within two decades. No longer would any nation be held hostage because of a lack of oil. Other nations began to emulate the British. The Swedes announced that they were building a ZETA-like device that could compete with the one at Harwell. Just two weeks after the announcement, Japanese scientists announced that they, too, had achieved thermonuclear fusion, and that they were producing more neutrons than the British. The Russians also started building a ZETA clone. But the Britons weren’t going to fall behind: by early May, they were busy upgrading ZETA and were planning a more powerful (and more expensive, at $14 million) machine, ZETA II. Its designers thought that ZETA II would heat plasmas to one hundred million degrees and produce more energy than it consumed. It would be the world’s first fusion power plant.

The dream had come crashing down. Once again, the culprit was those damn false neutrons.

Scylla, a new pinch machine at Los Alamos, was able to heat deuterium to more than ten million degrees. Scientists began to detect all the expected products of deuterium-deuterium fusion: protons, tritium nuclei, and neutrons. Tens of thousands of neutrons. With a few months of tinkering, physicists were getting roughly twenty million neutrons every time they ran the machine. It was a stunning success after so much failure.

Plasmas were proving very tough to control. In part, this was because plasmas were like nothing else scientists had encountered in nature. Plasmas behaved something like fluids, but unlike standard fluids, they interacted in extremely complex ways with magnetic and electric fields. Because of that electromagnetic component, understanding plasmas was becoming an entirely new discipline, vastly more complicated than hydrodynamics, the field that dealt with the behavior of ordinary fluids.

The laws of electromagnetism say that electrical currents spawn magnetic fields, and changing magnetic fields spawn electrical currents. This means that an electrical current traveling down the plasma will generate magnetic fields that generate electrical currents that generate magnetic fields, and so forth, and all these effects change the motion of the particles in the plasma, forcing them toward the center of the cloud. The deeper the physicists looked into plasma dynamics, the more strange effects they saw, secondary and tertiary and beyond, most of which made the plasma unstable.

This feedback between electric fields and magnetic fields is just one of many effects that make plasmas hard to predict. Another has to do with the density of the plasma. Electric currents behave differently in plasmas of different densities and pressures. Yet another issue had to do with the very makeup of the plasma. Scientists had been trying to ignore the fact that a plasma is not a nice, homogeneous substance made of a single kind of particle. A plasma is made of very heavy positively charged particles (the nuclei) and very light negatively charged particles (the electrons stripped from the atoms). These two kinds of particles have different properties and behave differently even when they are at the same temperatures, at the same pressures, and subjected to the same electromagnetic fields.

Physicists discovered that when they tried to heat a plasma, unless they were very careful they would pour most of the energy into the light (and easy to accelerate) electrons, leaving the heavy nuclei cold, unheated, and slow. This was really bad news. The whole point of heating the hydrogen plasmas was to heat up the nuclei so that they were moving fast enough to fuse; hot electrons and cold nuclei were all but worthless. Unless scientists could compel the hot electrons to share their energy with the nuclei, there would be no hope of fusion.

The brew did not behave the way scientists expected it to. It seemed to have a mind of its own, thwarting all attempts to keep it under control. Pinch it or squeeze it or even try to keep it confined in a magnetic trap and it writhed around and ruffled itself in instability after instability. Physicists built bigger and more expensive machines to wrestle the instabilities into submission, but they were failing. As the machines started costing millions and tens of millions of dollars, the scientists were no closer to building a fusion reactor than before; they were just uncovering more and more subtle ways that the plasma fought their will. As the scientists turned up the magnetic fields, they were surprised to discover that particles still zoomed out of control very quickly. Simply turning up the strength of the fields was not enough to bring the losses down to a reasonable level. Perhaps because the goal of a fusion reactor was so far out of reach, all the nations working on fusion energy decided to share their knowledge. The stakes had been lowered; there was no obvious path leading to limitless energy, so there was no harm in international collaboration.

Andrei Sakharov’s scheme appeared to provide an answer. His bottle looked little different from the ones the Americans and British were proposing. It was donut shaped (toroidal) and used coils of wire to induce magnetic fields, earning it the cumbersome name toroidal chamber with magnetic coil, or tokamak for short. But the tokamak was a bottle with a difference. Whereas the Stellarator used external magnetic fields to contain the plasma and the pinch machines used internal electric currents to squash it, the tokamak did both. The tokamak has multiple sets of coils. One group of coils sets up a magnetic field that constrains and stiffens the plasma; it’s an external magnetic bottle.

What gives the tokamak an extra bit of oomph is another set of coils that pinches the plasma. When scientists send a current through those coils, it induces a corresponding pinching current in the plasma circulating in the torus. This one-two punch of the external magnetic fields and internal current gave scientists a tool that, they hoped, would keep a hot, dense plasma stable for a long time. Of course, the tokamak design had drawbacks as well.

Unfortunately, the current in a tokamak’s plasma is just one more thing that can fail. If an instability causes the current to drop momentarily, things get very bad very quickly. The plasma suddenly loses its pinch and explodes in all directions. This event is called a disruption, and it can be extraordinarily violent. It can even damage the machine. (One disruption at a modern British tokamak made the whole thing, all 120 tons of it, jump a centimeter into the air.) However, the disadvantages of the tokamak soon seemed small compared to the advantages of the design.

In 1960 a short paper in Nature gave the physics community a powerful new tool: the laser. There are many methods of generating light. If you heat something hot enough, it begins to glow; when a substance is energetic enough, it emits visible light. This is how an incandescent lightbulb works; the filament in the bulb is simply heated to a very high temperature. It is a law of nature: the hotter an object is, the more light waves it emits. Or, if you prefer, you can think of the emissions as light particles rather than light waves. A particle of light, a photon, can interact with matter in a number of different ways. It can strike an atom and give it a kick. It can make the atom rotate or move in other ways. If the photon is just the right color, the atom can absorb it. Absorbing a photon “excites” the atom, packing it full of the energy that once resided in the light particle. This excited atom will soon disgorge the photon, emitting a light particle of precisely the same color and relaxing from its excited state.

In 1917, Albert Einstein made a curious prediction about excited atoms. Such an atom is quivering with energy, looking for an excuse to spit out the photon it has absorbed. Einstein’s calculations showed that if a photon of the right color happens by, one precisely the same color as the one the atom absorbed, then the atom will immediately disgorge a photon. This photon not only will be precisely the same color as the passerby but will also move with it in lockstep. The two photons will behave almost as a single object. This phenomenon is known as stimulated emission, and it is the mechanism the laser uses to produce its beam of light. Imagine that you have a hunk of material, a whole lot of atoms, that you want to turn into a laser. The first step is to excite all the atoms. You do this by “pumping” the material full of energy. It doesn’t matter how: some lasers pump a material with electricity, some with light, and some with chemical reactions.

This is where the clever part happens. Send a photon of that specific color into the material. The photon encounters an excited atom, which then disgorges a second photon of the same color through stimulated emission. These two photons move in lockstep. They encounter another excited atom, which emits another photon of the same color: three photons now in lockstep. Another excited atom, another photon: four photons, all the same color, all moving in precisely the same way. As the photons move through the material, they encounter more and more excited atoms, which emit more and more photons. The beam snowballs, growing bigger as it travels through the material. By the time it finally emerges, the beam consists of an enormous collection of light particles. It is an intense beam, and all the photons have the exact same color and are moving in lockstep, almost like one enormous particle of light. This is the secret to the laser’s power. The photons in an ordinary beam of light are like an unruly mob; the photons in a laser beam are an army marching together with a single mind.
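The snowballing can be sketched numerically: in the simplest (small-signal) picture, the photon count grows exponentially with the distance traveled through the excited material. The gain coefficient and length below are assumptions for illustration.

```python
import math

# Sketch of laser gain: each stretch of excited medium multiplies the photon
# count, so intensity grows as N(L) = N0 * exp(g * L). Numbers are assumed.
g = 0.5       # gain per centimeter of excited material (assumed)
L = 10.0      # length of the gain medium, cm (assumed)
seed = 1      # a single seed photon entering the material

photons_out = seed * math.exp(g * L)
print(f"one seed photon becomes ~{photons_out:.0f} photons after a single pass")
```

In a real laser, mirrors at each end send the beam back through the medium many times, so the amplification compounds far beyond a single pass.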

The laser’s tight beam allows it to travel great distances (to the moon and back, even) without scattering and dissipating too much. Because the beam is made of photons of the exact same color, it provides a great way to measure very, very hot temperatures. Shine a laser at a plasma. The photons in the beam will begin with the exact same color. But as the photons strike the fast-moving particles in the plasma, the plasma gives the photons a kick, adding a bit of energy to them, shortening their wavelengths and making them slightly bluer. By looking at the color of a laser beam after it hits a plasma, scientists can calculate the energies of the particles in the plasma, which in turn reveals the temperature.
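A toy version of that inference, assuming (my illustration, with made-up numbers) that the kick shows up as a simple Doppler-like fractional wavelength shift from scattering off electrons; a real Thomson-scattering diagnostic fits the whole broadened spectrum rather than a single shift:

```python
K_B = 1.380649e-23   # Boltzmann constant, J/K
M_E = 9.1093837e-31  # electron mass, kg
C   = 2.99792458e8   # speed of light, m/s

def temperature_from_shift(frac_shift: float) -> float:
    """Toy inversion: read the fractional wavelength shift as v/c
    for a scattering electron, then convert kinetic energy to a
    temperature via (1/2) m v^2 = (3/2) k T."""
    v = frac_shift * C
    return M_E * v * v / (3.0 * K_B)

# A ~7% shift corresponds to roughly ten million kelvin, the
# "tens of millions of degrees" regime the British team measured.
print(f"{temperature_from_shift(0.07):.2e}")
```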

When the British scientists shined a laser beam at Artsimovich’s tokamak plasma, they saw that the Russians were not exaggerating. Their plasma was tens of millions of degrees, dense, and relatively well confined. The tokamak was performing much better than the other forms of magnetic bottles. This was wonderful news for the fusion community in the West. Sakharov’s invention showed a way to bypass the troubles of the pinch machines and the Stellarators. Practically overnight, plasma physicists across the world scrapped their devices and built tokamaks.

The laser was about to change the landscape even more dramatically by providing an alternative to the magnetic bottle.

Lasers produce particularly intense and yet easily controlled light beams. You can point a laser with great precision and make it dump an enormous amount of energy in a very tiny space. To Andrei Sakharov, this suggested that laser beams could be used to heat and contain a plasma of hydrogen. If it worked, laser fusion would be an even more straightforward method than using magnets. One could simply shine laser light on a pellet of deuterium fuel from all directions: the beams would heat and compress the pellet, creating a tiny fusion reaction, a miniature sun girdled on all sides by light. The plasma would be compressed not by magnetic fields but by particles of light. This was the birth of inertial confinement fusion.

Livermore scientists began building laser bottles intended to ignite and contain plasma. A true laser-based bottle would require laser beams to hit the target from all sides at once to fully confine it.

Not only was Keeve Siegel using lasers to ignite fusion, but he was doing it as the head of a private company, KMS Fusion, not as a scientist in a government laboratory. The public took this as a sign that private industry was embracing fusion reactors as a viable source of energy. Siegel, the entrepreneur, exuded confidence in public. He was sure, he said, that he could turn lasers into “efficient fusion power” within “the next few years.” After false starts and two decades of struggle with magnetic bottles, the era of fusion finally seemed at hand. The timing could scarcely have been better. The United States was just getting through its first oil crisis. Congress immediately seized upon laser fusion and started pouring money into fusion research. Laser fusion saw a dramatic increase in funding, growing from almost nothing to $200 million per year by decade’s end. Livermore and some other laboratories around the country, particularly those at Los Alamos and at the University of Rochester in New York, began to plan massive laser projects with an eye toward creating a viable fusion reactor.

Magnetic fusion budgets doubled and doubled and doubled again. In 1975, more than $100 million went to magnetic fusion; by 1977, more than $300 million; and by 1982, almost $400 million.

By 1975, Livermore’s Janus was already suffering from a major snag. Its lasers were extremely powerful for their day, pouring an unprecedented amount of laser light into very tiny spaces. Livermore’s scientists managed to get this level of power by taking enormous slabs of neodymium-doped glass and exciting them with a flash lamp. This glass was the heart of Livermore’s laser. The slabs were what produced an enormous number of infrared photons in lockstep. The resulting beam exited the glass and was bounced around, guided by lenses and mirrors to the target chamber. However, the beam was so intense that it would heat whatever material it touched. This heat changed the properties of lenses, mirrors, and even the air itself. When heat changes the properties of a lens or a mirror, it alters the way the device focuses the beam. These little changes in focus would start creating imperfections in the beam, such as hot and cold spots. These could be disastrous. The hot spots in the beam would pit lenses, destroying them in a tiny fraction of a second. Every time they fired the Janus laser, the machine tore itself to shreds.

Their next-generation fusion machine, Argus, used a clever technique to eliminate those troublesome hot spots. By shooting the beam down a long tube and carefully removing everything but the light at the very center of the beam, the scientists would be assured of getting light that was uniform and pure, free of hot spots. This meant that the laser had to be housed in a very large building to accommodate the tubes, which were more than 100 feet long. More serious was the problem with electrons. Magnetic fusion researchers had trouble heating the plasma evenly; the lightweight electrons would get hot faster than the heavyweight nuclei, making for a very messy plasma soup. This problem was worse with lasers: light that is shined on a hunk of matter tends to heat the electrons first. This was a huge issue. The electrons in a laser target would get so hot that the target would explode before the nuclei got warmed up. Hot electrons and cold nuclei were no good for fusion; it was the nuclei that scientists really wanted to heat up.

For technical reasons, the bluer the laser beam, the smaller this effect. So the Livermore scientists shined the laser light through crystals that would make the infrared beam green or even ultraviolet. The color conversion worked well to reduce the heating of the electrons, but the process was inefficient. The beam lost some of its energy because of the color change. It also made the laser more expensive, as big, high-quality color-change crystals were not cheap. Livermore’s next step was a full-size machine, Shiva, that would use 20 beams to zap a pellet of deuterium from all directions. It would ignite the pellet, creating a fusion reaction that would generate as much energy as the laser poured in. Or so the scientists hoped. They were wrong by a factor of 10,000. Laser fusion scientists, like the magnetic fusion advocates that preceded them, were about to come face-to-face with a nasty instability.

Laser fusion is the equivalent of keeping water trapped in an upside-down glass. As you compress a pellet of deuterium, it becomes denser and denser. Long before you get it hot and dense enough to fuse, it will be much denser than whatever substance you are using to compress it, whether it is particles of light or a collection of hot atoms. You are using a less-dense substance to squash and contain a much denser one, and that means you will get Rayleigh-Taylor instabilities. Any tiny imperfections on the interface between the plasma and the stuff that is pushing on the plasma will immediately grow. Even an almost perfectly round sphere of deuterium will quickly become distorted, squirting tendrils in all directions. Just as this ruins any attempt to keep water in an inverted glass by means of air pressure, it seriously damages a machine’s ability to compress and contain a plasma by means of light. The only way around this was to make sure there were almost no imperfections. The target had to be perfectly smooth, and the compressing lasers had to illuminate the target completely uniformly, without any hot or cold spots that would lead to ever-growing Rayleigh-Taylor tendrils.
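Classical linear theory makes the passage's point about imperfections quantitative: a surface ripple of wavenumber k grows like exp(gamma t) with gamma = sqrt(A k g). A sketch (the densities and acceleration below are illustrative placeholders, not real target numbers):

```python
import math

def rt_growth_rate(rho_heavy: float, rho_light: float,
                   wavelength: float, accel: float) -> float:
    """Classical linear Rayleigh-Taylor growth rate
    gamma = sqrt(A * k * g), with Atwood number
    A = (rho_h - rho_l) / (rho_h + rho_l) and k = 2*pi/wavelength.
    A ripple on the interface grows like exp(gamma * t)."""
    atwood = (rho_heavy - rho_light) / (rho_heavy + rho_light)
    k = 2.0 * math.pi / wavelength
    return math.sqrt(atwood * k * accel)

# Halve the ripple wavelength and the ripple grows sqrt(2) times
# faster, so ever-finer imperfections blow up ever more quickly.
g_coarse = rt_growth_rate(1000.0, 1.0, 1e-5, 1e12)
g_fine   = rt_growth_rate(1000.0, 1.0, 5e-6, 1e12)
print(g_fine / g_coarse)  # ~1.414
```

This wavelength dependence is why both the target surface and the beam illumination had to be made so extraordinarily smooth.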

It was almost as if the laser scientists were trying to invert a glass so carefully that the surface of the water inside wouldn’t ripple at all. This is an extraordinarily difficult task. Even the 20-armed Shiva machine, heating the plasma from 20 different directions at once, wasn’t uniform enough to keep the Rayleigh-Taylor instabilities in check. The 20 pinpricks of laser light were far enough apart from one another that they would create hot spots in the target rather than heating it uniformly. The pellet would compress, getting hot and dense enough to induce a little bit of fusion, but before the reaction really got going, the Rayleigh-Taylor instability would take over. Tendrils would form. Instead of getting denser and hotter, the deuterium would squirt out.

Livermore’s scientists tried a new approach: instead of using the lasers to push directly onto a dollop of deuterium, they did it indirectly. The pellet was ensconced at the center of a hollow cylinder known as a hohlraum. Instead of striking the pellet, the lasers struck the insides of the hohlraum. The hohlraum then radiated x-rays toward the pellet. This setup is known as indirect drive. But it didn’t do enough. Shiva, which had cost $25 million to build, only performed a fraction as well as its designers had hoped. It didn’t come close to producing as much energy from fusion as it took to run the lasers. The answer seemed within reach, though: just build a bigger Shiva, one with ten times the power, and ten times the price. By the beginning of the 1980s, Livermore was building a $200 million laser named Nova.

Even today, decades later, these two approaches, magnetic fusion and inertial confinement fusion, remain the ways that most scientists are trying to bottle up a tiny sun. But both methods are extremely expensive, and both are plagued with instabilities that threaten to destroy the dream of unlimited fusion energy.

Cold fusion (read the chapter; I can’t capture all the complexity and history)

In September 2006 on the outskirts of Washington, DC, the Conference on Future Energy was a celebration of sorts. Its convener, Thomas Valone, had recently won a long legal battle with his employer, the U.S. Patent and Trademark Office. Valone was a patent examiner who had, in his view, been fired for his belief in cold fusion. Cold fusion had burst upon the world nearly two decades earlier and had long since been discredited by the mainstream scientific community. Yet today it still has a strong following, a core of true believers who think it will help humanity unleash unlimited power from fusing atoms.

Plenty of reporters, government officials, and even scientists remain under its spell. The dream of unlimited energy through cold fusion is so powerful that for almost twenty years the faithful have been willing to risk ridicule and isolation to follow it.

The biggest scientific scandal of the twentieth century began on March 23, 1989. Two chemists at the University of Utah, Martin Fleischmann and Stanley Pons, told the world that they had tamed the power of fusion energy at room temperature, bottling up a miniature star in a little hunk of metal. The university’s press release was full of enthusiasm: Pons and Fleischmann’s setup was supposedly making an end run around physics’ requirements for fusion. There was no attempt to heat the deuterium to millions of degrees or to compress it to high densities. The chemists merely took a little rod of palladium metal, plopped it in a jar full of deuterium-enriched water, and ran an electric current through it. Somehow, without the benefit of high temperature and high pressure, the deuterium atoms were fusing inside that metal.

Over the years, Pons and Fleischmann had published some papers that seemed ludicrous (such as one that involved highly unlikely reactions of nonreactive gases), but the two still maintained a good reputation. This is part of the reason that cold fusion got so much attention. Pons and Fleischmann were established scientists. The cold-fusion experiment was deceptively simple. At the heart of each “reactor” was a rod or a sheet made of palladium. Palladium is a whitish metal that shares numerous properties with platinum and nickel. Oddly, it is able to soak up enormous volumes of hydrogen (the tiny hydrogen atoms nestle between the atoms of palladium), so researchers had been studying the metal in hopes of coming up with a method for storing hydrogen in fuel cells. Pons and Fleischmann had long been intrigued by this hydrogen-sponging behavior. Perhaps the hydrogens were very crowded in the small spaces between the palladium atoms. Perhaps those spaces were so crowded that the hydrogen atoms were bumping into each other with great force. Perhaps, if the hydrogen was replaced with deuterium, those collisions would be violent enough to spark fusion. At first, they spent their own money, about $100,000 for the first crude experiments. “Stan and I thought this experiment was so stupid we financed it ourselves,” Fleischmann said at the press conference.

MUON-CATALYZED FUSION: Ordinary atoms have large electron clouds (left) that make it hard for the nuclei to get close enough together to fuse. Replace the electrons with muons (right) and the muon cloud is much smaller; nuclei get together much more easily and are able to fuse at relatively low temperatures.

Muon-catalyzed fusion, as it came to be known, really was room-temperature fusion. If scientists could somehow replace the electrons in a jar full of hydrogen with muons, they would be able to get a fusion reaction without the need for immense heat and pressure; the muon hydrogens would fuse by virtue of their smaller size. Unfortunately, muons are hard to come by. To get them in large numbers, scientists need to build a particle accelerator. Accelerators consume lots of energy, and they are not very efficient.

Even if scientists found an efficient way of producing muons, the muons they would create would last only a few microseconds before decaying into electrons and a handful of other particles. If in those moments the scientists then successfully shot one of those muons into a cloud of hydrogen, they might get lucky and induce two atoms to fuse into helium, but what then? The muon can get trapped in the helium atom, and then it is useless. It will quickly decay without helping any other atoms to fuse. If every available muon catalyzed only one atomic fusion, then there is no hope of producing energy; merely creating the muons and delivering them would consume more energy than was released by that single fusion. If, on the other hand, a muon can escape the clutches of the helium nucleus, then helps another fusion to occur, escapes, helps another fusion, and so forth, then muon-catalyzed fusion would not be hopeless after all. If every muon induces a few hundred fusions before decaying, then perhaps it would be possible to generate more energy than the amount used to create the particles in the first place. Muon-catalyzed fusion would achieve breakeven.
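The arithmetic behind "a few hundred fusions" is simple. Assuming a commonly quoted rough figure of about 5 GeV of accelerator energy to produce one muon (my assumption, not a number from the book) and the 17.6 MeV released per deuterium-tritium fusion:

```python
# Breakeven fusions per muon = energy spent making the muon
# divided by energy released per fusion.  The ~5 GeV muon cost
# is a rough, commonly quoted accelerator figure (an assumption);
# 17.6 MeV is the standard D-T fusion yield.

MUON_COST_MEV = 5000.0
DT_YIELD_MEV = 17.6

fusions_needed = MUON_COST_MEV / DT_YIELD_MEV
print(round(fusions_needed))  # 284: each muon must catalyze a few
                              # hundred fusions just to break even
```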

When Alvarez first saw the phenomenon in deuterium, he was extremely excited. “We had a short but exhilarating experience when we thought we had solved all of the fuel problems of mankind,” he said in his Nobel Lecture a decade later. “While everyone else had been trying to solve the problem by heating hydrogen plasmas to millions of degrees, we had apparently stumbled on the solution, involving very low temperatures instead.” Unfortunately, as Alvarez’s team performed more detailed calculations, they concluded that the muons quickly got stuck in helium and decayed, and that muon-catalyzed fusion of deuterium would never lead to a practical energy source.

Jones trumpeted the potential of muon-catalyzed fusion in seminars, lectures, and papers, and cowrote a Scientific American article about it in 1987. “It is now conceivable that cold fusion may become an economically viable method of generating energy,” the article read.

A Swiss team, for example, performing similar experiments, was not seeing the same density effects that Jones was observing. Their muons got stuck in helium atoms fairly rapidly, as expected. Instead of seeing hundreds of fusions per muon, they were seeing tens. Muon-catalyzed fusion would never lead to breakeven at this rate. And as the Department of Energy’s money for muon-catalyzed fusion began to run out-the Division of Advanced Energy Projects had already spent more than $2 million-prospects for muon-catalyzed fusion began to dim.

Pons and Fleischmann’s work had much in common with Jones’s. Both were hoping to trap deuterium in a hunk of metal-particularly palladium-and force it to fuse somehow. If money was to be made from cold fusion (and if Pons and Fleischmann were correct, cold fusion would be a moneymaker unlike almost any other invention), only the patent holders would see huge benefits. Only the people who discovered cold fusion would be able to patent the process. And only the people to go public with their work first would be hailed as the discoverers. All the money, glory, and power that might come from the discovery of cold fusion hinged upon being the first to go public.

Jones, Pons, and Fleischmann had entered an ever-quickening race to run experiments, prove the existence of cold fusion, write a paper for a peer-reviewed journal, and publish it. By early 1989, the competitors had agreed to submit simultaneous papers to Nature, so they could all cross the finish line simultaneously. But in a climate of increasing mistrust and antagonism, Pons and Fleischmann jumped the gun. They submitted their paper to the Journal of Electroanalytical Chemistry on March 10, and within two weeks they were in front of the microphones, touting their achievement to the world-despite the improbability of what they had found.

Free-floating protons are relatively common, but free-floating neutrons are rarer, as are tritium and helium-3. So if you think that you’ve got deuterium-deuterium fusion going on in your laboratory, the best way to convince other people is to demonstrate that you are making tritium, helium-3, and neutrons. The neutrons, arguably, should be the easiest to detect. Neutrons penetrate matter very easily, so any neutrons produced by the reaction would quickly fly out through the walls of the beaker and into the walls surrounding the room. A neutron detector need only be placed next to the reactor vessel and it would certainly pick up some of these particles.

For every watt of power the cell produced, about a trillion neutrons should have been flying out every second. At the power levels Pons and Fleischmann were seeing, their beaker should have been emitting dangerous and easily detectable levels of radioactivity. But it wasn’t.
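That "trillion neutrons per watt" figure follows directly from the reaction energetics. In deuterium-deuterium fusion, roughly half the reactions yield helium-3 plus a neutron (about 3.27 MeV) and half yield tritium plus a proton (about 4.03 MeV), so one neutron is freed for every two fusions. A back-of-envelope check:

```python
MEV_TO_J = 1.602176634e-13  # joules per MeV

# Average energy per D-D fusion, with one neutron per two fusions.
avg_yield_j = 0.5 * (3.27 + 4.03) * MEV_TO_J
fusions_per_watt = 1.0 / avg_yield_j        # fusions per second
neutrons_per_watt = fusions_per_watt / 2.0  # neutrons per second

print(f"{neutrons_per_watt:.2e}")  # roughly 8.5e11, close to the
                                   # trillion per second the text cites
```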

When Pons and Fleischmann announced their discovery to the world on March 23, 1989, Utahans immediately sought to capitalize on the news. The day after the press conference, Governor Norman Bangerter announced that he would call a special session of the legislature to appropriate $5 million for cold-fusion research. The appropriations bill passed overwhelmingly. The money would help establish a National Institute for Cold Fusion at Utah. Soon cold-fusion lobbyists would be marching up Capitol Hill seeking tens of millions of dollars, promising that Japan would steal cold-fusion momentum away from the United States if the nation didn’t invest immediately.

Edward Teller called to congratulate Pons and Fleischmann and started a Livermore task force to look into cold fusion. Others, including the University of Utah’s own physics department, which had been kept in the dark by the chemists, were extremely wary of the results. Pons and Fleischmann had held their press conference before publishing their data and their methods. This was very unusual. Scientists communicate through scientific presentations and papers, not through press releases and press conferences. On the relatively rare occasions that a scientific result is important enough to merit a press event, it is usually held at the same moment that the data are revealed to the scientific community through a paper or in a presentation. With the cold-fusion announcement, the paper was missing. No data were available, and scientists had only the scantest details about how Pons and Fleischmann performed their experiment. The suspense would last for months.

Privately, though, Pons and Fleischmann were getting bad news. Two days before the press conference, Fleischmann learned that even the hypersensitive neutron detector at Harwell wasn’t picking up anything. There was no trace of the trillions and trillions of neutrons that should have been flowing from the palladium. The biggest critics of cold fusion were plasma physicists. These were the people who knew a lot about the difficulty of achieving fusion, and who had learned through painful experience how neutrons can fool you. The lack of details from Pons and Fleischmann was frustrating physicists who were trying to confirm the cold-fusion experiments using data gleaned from television broadcasts and newspaper photographs.

If Pons and Fleischmann were actually seeing fusion in a test tube, they should have been able to show that the effect was not due to a quirk in their apparatus. To do this, they needed to run a control experiment-one that was almost identical to the fusion cell, but subtly different in a way that would prevent fusion from occurring. Only then could they prove that fusion was really responsible for the excess heat and other effects they were seeing. In the Pons and Fleischmann case, the obvious control experiment was to run an identical experiment with ordinary water rather than heavy, deuterium-laden water. If deuterium-deuterium fusion was responsible for the excess heat, getting rid of the deuterium and replacing it with ordinary hydrogen should end the fusion and turn the heating off. They then could be assured that the heat had something to do with the deuterium in the beaker. Doing this was absolutely necessary if Pons and Fleischmann were to prove to other scientists that they were not deluding themselves.

Indeed, this sort of control experiment is what budding scientists are taught to do in freshman science classes, and everybody expected it from such established scientists as Pons and Fleischmann-not to have run one would seem absurd.

The scientific community wanted to see the results of those control experiments, but neither the Journal of Electroanalytical Chemistry paper nor the one that Pons and Fleischmann submitted to Nature had any sign of such a control. “How is this astounding oversight to be explained to students. . . . And how should the neglect be explained to the world at large?” asked John Maddox, the editor of Nature.

The physics community was in an uproar. Pons and Fleischmann were too busy to revise their paper for Nature, too busy to respond to requests for clarification and information from skeptics, too busy to attend the upcoming American Physical Society (APS) meeting in Baltimore, but not too busy to hype their claims to Congress in hopes of grabbing $25 million of federal pork. The researchers were making ever more bizarre claims (such as the helium-4 detection) and getting increasingly defensive. In the view of most physicists, the pair had been evasive, self-contradictory, and perhaps less than honest. The mood in the physics community was poisonous. At the Baltimore meeting on May 1, it all erupted.

It was a mortal blow. To most mainstream scientists, cold fusion was dead. The New York Times’s obituary was a piece entitled “Physicists Debunk Claim of a New Kind of Fusion.” Even the Wall Street Journal admitted that the session had been a “devastating” attack on the Utah team’s credibility, but was less willing to give up hope for cold fusion. (Over the next few weeks, the stream of hopeful news, new confirmations and evidence in favor of cold fusion, continued gracing the pages of the Journal.) But to most scientists, cold fusion was well and truly dead, even though, as physicist Robert Park noted, the corpse probably would “continue to twitch for a while.” (This was, as it turns out, an understatement.)

Positive reports from increasingly sketchy research kept dribbling in. These persuaded some scientists, as well as a number of mainstream organizations, including the Electric Power Research Institute and the Stanford Research Institute, that there had to be something to cold fusion. (As late as October 1989, Edward Teller apparently was in favor of funding cold-fusion experiments.)

And so the corpse of cold fusion continued to twitch.

The Japanese gave up on cold fusion in 1997, after having spent tens of millions of dollars without any concrete results.

Steven Jones, too, was driven to the fringe. Though he kept his post at Brigham Young University, his research got increasingly bizarre. A devout Mormon, he tried to prove that Jesus Christ had visited Mesoamerica (he thought that marks on the hands of Mayan gods were evidence that Christ, with his stigmata, was their inspiration). Then, in 2006, he came out with a study that purported to prove that the World Trade Center had been demolished by explosives inside the building, not by the jets that struck from the outside.

If Pons, Fleischmann, and Jones had been the only ones who supported cold fusion, the idea would have long since passed out of the public consciousness. But some serious-sounding scientists at some serious-sounding institutions were convinced that there had to be something to the cold-fusion claims. The cold-fusion movement also drew strength from the press. Reporters seem genetically predisposed to take the side of the underdog, and the cold-fusion-versus-big-science story certainly had one. Some journalists were true believers, and others just were offended by mainstream science’s treatment of the cold-fusion researchers. Their gripes came out as a slow and steady drumbeat. “These folks need a fair hearing,” said ABC News science correspondent Michael Guillen in 1994. In 1998, Wired’s Charles Platt suggested that ignoring new cold-fusion research might be “a colossal conspiracy of denial.”

The scientific community had scrutinized their claims. They found the Utah group’s work sloppy at best, and systematically demolished the chemists’ claims. Cold-fusion advocates had spent millions of dollars researching the phenomenon and still did not have a device that could reliably heat a cup of water for tea. The burden of proof, as always in science, is on the people who claim extraordinary things.

By 2004, the pressure had grown to the point that the Department of Energy felt it necessary to review whether cold fusion merited renewed funding. (The term cold fusion had been dropped in favor of the less-pejorative low-energy nuclear reactions.) The conclusions were much the same as they had been a decade and a half earlier.

Magnets and lasers are still the only ways scientists see to create fusion

The more researchers experiment with fusion, the more most of them are convinced that the best-if not the only-way to create a fusion reactor is with a hot plasma, confined and compressed by some powerful force. Nowadays, that leaves only two realistic options: big, expensive magnets or big, expensive lasers. Both approaches require billions of dollars and thousands of scientists. And both have secrets. Laser fusion’s secret is a matter of national security; magnetic fusion’s secret is a matter of some embarrassment. Both secrets threaten the future of fusion energy.

Nova failed to achieve breakeven. The laser was certainly generating fusion reactions. By the mid-1980s it was producing about ten trillion fusion neutrons with each shot. But again, the laser consumed one thousand to ten thousand times as much energy as the fusion reactions produced. Once more, LASNEX had failed, and the scientists’ optimistic expectations were crushed. This time, though, their failure had cost almost $200 million.

Instead of using lasers to ignite a pellet, the Halite/Centurion program used nuclear bombs. Though very little information is available about the Halite/Centurion experiments, some details have dribbled out. It appears that the tests used hohlraums-the little metal tubes that are crucial to indirect-drive laser fusion-containing target pellets. These hohlraums and pellets were placed deep underground, at various distances from nuclear bombs. When the bombs went off, they radiated x-rays in all directions. Some of those x-rays shined into the hohlraums, which reradiated x-rays toward the pellet, just as in a laser fusion experiment.

If the pro-laser-fusion scientists are to be believed, then Halite/Centurion showed that the laser fusion program is on the right track. Not everybody agrees. Apparently, the hohlraums in the Halite/Centurion experiments received varying amounts of energy, from tens to hundreds of millions of joules, about a thousand times greater than the energy even the Nova laser would deliver. But even with that much energy driving them, 80 percent of the capsules failed to ignite.

LASNEX didn’t predict the failures. Los Alamos physicist Leo Mascheroni argues that the pro-laser-fusion lobby is hiding negative results behind a wall of secrecy; if outside scientists could see the data, he says, they would conclude that Halite/Centurion proved that the laser fusion program was failing miserably.

Who is correct? It’s a secret. Those scientists who have access to the data from Halite/Centurion can’t talk; it’s unlawful for them to make any details public. Those who don’t have access obviously can’t assess the arguments. It’s the big secret of laser fusion.

Magnetic fusion has the advantage of openness. You can read almost all the literature that has been written about it.

The big tokamak in the United States would be at Princeton: the Tokamak Fusion Test Reactor (TFTR), which promised to achieve breakeven. TFTR was supposed to cost a bit more than $300 million, but as often is the case with cutting-edge science projects, the expenditures ballooned well beyond that by the time the project was finished.

For a tokamak, the promised land is not just breakeven. It is known as “ignition and sustained burn.” Unlike laser fusion devices, which have to create individual bursts of fusion energy, a magnetic fusion device like a tokamak can, in theory, run nonstop, producing continuous energy. Once scientists are able to get their magnetic bottles strong enough, they will be able to exploit this and keep a fusion reaction running indefinitely. The fusion reactions in the belly of the tokamak should suffice to keep the plasma hot, so after they get it started, the reaction will essentially run itself. All the scientists have to do is periodically inject some more deuterium and tritium fuel into the reactor and remove the helium “ash” from the plasma. Once you figure that out, you’ve got an unlimited source of power.
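The "ignition and sustained burn" condition is often summarized by the Lawson triple product: for deuterium-tritium fuel, density times temperature times energy confinement time must clear roughly 3e21 keV·s/m³ (a textbook rule of thumb for temperatures around 10-20 keV; the plasma numbers below are purely illustrative):

```python
IGNITION_TRIPLE = 3e21  # keV * s / m^3, rough D-T ignition threshold

def ignites(density_m3: float, temp_kev: float, tau_s: float) -> bool:
    """True if the plasma's triple product (density x temperature x
    energy confinement time) clears the rough D-T threshold."""
    return density_m3 * temp_kev * tau_s >= IGNITION_TRIPLE

# A plasma of 1e20 particles/m^3 at 15 keV must hold its heat for
# about 2 seconds; with a quarter of that confinement it falls short.
print(ignites(1e20, 15.0, 2.0))   # True
print(ignites(1e20, 15.0, 0.5))   # False
```

This is why bigger, stronger magnetic bottles matter: a larger tokamak holds its heat longer, pushing the confinement time toward the threshold.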

By the time Ronald Reagan came into office, the climate for fusion was already changing. The OPEC crisis was fading into memory, and energy research was not a high priority for the new president. He scuttled Carter’s plan, and as budget deficits rose, fusion energy money began to disappear, $50 million hunks at a time. The panoply of glorious experiments planned in the 1970s began to crumble under increasing financial pressure. As magnetic fusion budgets dwindled, researchers struggled to save their precious tokamaks from the budget ax. A huge magnetic-mirror project that had already swallowed more than $300 million was scrapped just as it finished its eight-year construction and was about to be dedicated. It never got turned on. One after another, new facilities (such as the “Elmo Bumpy Torus” and the “Impurity Studies Experiment”) died on the drawing board. The TFTR program was delayed, but not cancelled.

There was no way, with budgets as they were, that fusion scientists could ever hope to build a magnetic fusion reactor. A tokamak big enough and powerful enough to keep a plasma burning indefinitely would cost billions, and America’s fusion budget could never withstand that sort of strain. So an international fusion project, ITER, was proposed: a real monster. As design work began on it, scientists realized that it would cost $10 billion. The four parties (the United States, the Soviet Union, Europe, and Japan), working together, could cough up the money, but ITER would devour the fusion budgets of all the participating countries. Even the big tokamaks (TFTR, JET, JT-60) would not survive. Once the ITER project was under way, there would be no room in the budget for anything else. This was a big problem.

Princeton scientists did not want their facility to disappear. Other fusion researchers, especially those who thought that non-tokamak machines were still worth exploring, were angry that the world was going to gamble all its fusion money on a tokamak while ignoring all other possibilities. Almost everyone agreed that a big international reactor effort would be a wonderful thing, but at the same time everyone wanted to have a thriving domestic fusion program, too.

Where can we as a society get our energy? Fossil fuels pollute, cause global warming, and are running out. Renewable sources (solar, geothermal, wind) can’t provide nearly enough energy for an industrial society. That leaves nuclear energy: fusion or fission. Holt argued that fission is messy: a fission reactor uses up its fuel rods and leaves behind a radioactive mess that nobody knows how to dispose of. Fusion, on the other hand, leaves no harmful by-products. It runs on deuterium and tritium, he said, and leaves only harmless helium behind. Clean fusion energy would be a much better choice. This is the sales pitch of faithful magnetic fusion scientists everywhere. Fusion provides unlimited power: clean, safe energy without the harmful by-products of fission. But there is a dirty little secret. Fusion is not clean. Once again, it’s the fault of those darn neutrons.

Magnetic fields can contain charged particles, but they are invisible to neutral ones. Neutrons, remember, carry no charge and do not feel magnetic forces. They zoom right through a magnetic bottle and slam into the walls of the container beyond. Since a deuterium-deuterium fusion reaction produces lots of high-energy neutrons (one for every two fusions), the walls of a tokamak reactor are bombarded with zillions of the particles every moment it runs. Neutrons are nasty little critters. They are hard to stop: they whiz through ordinary matter rather easily. When they do stop, when they strike an atom in a hunk of matter, they do damage. They knock atoms about. They introduce impurities. A metal irradiated by neutrons becomes brittle and weak. That means the metal walls of the tokamak become susceptible to fracture before too long. Every few years, the entire reactor vessel, the entire metal donut surrounding the plasma, has to be replaced.

Neutrons also make materials radioactive. The neutrons hit the nuclei in a metal and sometimes stick, making the nucleus unstable. The longer a substance is exposed to neutrons, the “hotter” it gets with radioactivity. By the time a tokamak’s walls need to be replaced, they are quite hot indeed.

Though fusion scientists portray fusion energy as cleaner than fission, a fusion power plant would produce a larger volume of radioactive waste than a standard nuclear power plant. It would also be just as dangerous, at first. Much of the waste from a fusion reactor tends to “cool down” more quickly than the waste from a fission reactor, taking a mere hundred years or so until humans can approach it safely. But that means humans will have to figure out where to store it in the meantime, as well as the rest of the waste that, like spent fission fuel, will remain untouchable for thousands of years. Fusion is a bit cleaner than fission, but it still presents a major waste problem.

Fusion scientists recognize this and are working on exotic alloys that are less affected by neutron bombardment, materials made of vanadium and silicon carbide. However, developing those materials is going to cost a lot of money, and they will still present a waste problem.

The Oak Ridge National Laboratory was making a claim that seemed eerily reminiscent of cold fusion. Its scientists claimed to have induced fusion reactions on a tabletop using a process that might lead to energy production. More important, they did it in an ingenious, and seemingly plausible, way: with a technique linked to a mysterious phenomenon known as sonoluminescence. As early as the 1930s, scientists had discovered a bizarre method to convert sound into light. If you take a tub of liquid and bombard it with sound waves in the correct manner, the tub begins to generate tiny bubbles that glow with a faint blue light. This phenomenon is not perfectly understood, but scientists are pretty sure they know what is going on, at least in gross terms. If you have ever belly flopped off a diving board, you know that a liquid like water doesn’t always behave quite like a fluid. Hit it hard enough and fast enough, faster than the water can flow out of your way, and it feels almost like concrete. It behaves more like a solid than like a liquid. This is more than a mere metaphor. Under certain circumstances, if you hit a liquid in the right way, it will “crack” just as a solid would. The liquid ruptures, creating tiny vacuum-filled bubbles that instantly fill with a tiny bit of evaporated liquid.

Under the right conditions, the sound waves reverberating through the liquid also cause these bubbles to compress and expand, compress and expand. Each time the bubbles are squashed by the sound waves, they heat up. If the sound waves are just right, the bubble can collapse to roughly one-tenth its original size, heating up to tens of thousands of degrees and emitting a flash of light. This is sonoluminescence.
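The numbers here are consistent with a simple adiabatic-compression estimate. The sketch below is my own illustration, not from the book: it treats the bubble contents as an ideal monatomic gas collapsing to one-tenth its radius, ignoring shocks and other real-gas effects.

```python
# Back-of-the-envelope estimate of sonoluminescent bubble heating,
# assuming a simple adiabatic collapse of an ideal monatomic gas.
gamma = 5.0 / 3.0        # adiabatic index for a monatomic ideal gas
r_ratio = 10.0           # bubble radius collapses to 1/10 of its size
T_initial = 300.0        # K, roughly room temperature

# Volume scales as r^3, so T_final = T_initial * (V1/V2)**(gamma - 1)
T_final = T_initial * r_ratio ** (3 * (gamma - 1))
print(f"{T_final:.0f} K")  # -> 30000 K, i.e. "tens of thousands of degrees"
```

A tenfold radius collapse means a thousandfold volume compression, which for this idealized gas multiplies the temperature by a factor of one hundred, matching the "tens of thousands of degrees" in the passage.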

The first problem Taleyarkhan encountered was that tens of thousands of degrees isn’t nearly enough to induce fusion, so ordinary sonoluminescence didn’t have any hope of getting deuterium nuclei to stick together. For fusion, he needed to heat deuterium to tens of millions of degrees, a thousand times hotter than what traditional sonoluminescence could achieve. The only way to get those temperatures was to compress the bubbles far more than had ever been done before, either by squashing them tighter or by starting with bigger bubbles. Taleyarkhan had figured out an innovative way to do the latter. To all appearances, Taleyarkhan and his colleagues did all the right things when they went looking for deuterium-deuterium fusion. The paper told of how the researchers looked for neutrons, and found them. Tritium? Found it. They also avoided many of Pons and Fleischmann’s mistakes. They ran the obvious control experiments, substituting ordinary acetone for the deuterated variety. The neutrons and tritium disappeared.

But I was skeptical. For one thing, I knew Taleyarkhan, and while I held him in reasonably high esteem, I didn’t think of him as a fusion expert. What really bothered me, though, were the neutrons. The bubble fusion paper was going to live or die by the neutrons Taleyarkhan was claiming to see. Neutrons were what killed Pons and Fleischmann. Neutrons were what killed ZETA. Without a nice, clear demonstration of neutrons of the proper energy (2.45 MeV) streaming from the experimental cell, nobody would take Taleyarkhan seriously for a minute. So the first thing I looked at was the paper’s graph of neutrons. I was surprised. Skeptical physicists would only be convinced by a detailed graph showing how many neutrons were detected at what sorts of energies. Taleyarkhan’s paper had a few graphs, but they were far from detailed. The main one had only four points, two for the deuterium experiment and two for the control experiment, telling how many neutrons were detected above and below 2.5 MeV. That wasn’t nearly enough.
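The 2.45 MeV figure isn’t arbitrary: in the D + D → helium-3 + neutron branch, the reaction’s energy release (a Q-value of about 3.27 MeV) is split between the two products, and momentum conservation hands the light neutron the lion’s share. A quick sketch of the arithmetic, my own illustration using approximate masses in nucleon units:

```python
# Why D-D fusion neutrons arrive at 2.45 MeV:
# in D + D -> He-3 + n, two bodies fly apart back-to-back, so momentum
# conservation splits the Q-value inversely to mass; the light neutron
# carries most of the energy.
Q = 3.27                  # MeV released by the D(d,n)He-3 branch
m_n, m_he3 = 1.0, 3.0     # approximate masses in nucleon-mass units

E_neutron = Q * m_he3 / (m_n + m_he3)
print(f"{E_neutron:.2f} MeV")  # -> 2.45 MeV
```

Any neutron signal claiming to come from D-D fusion therefore has to cluster at that energy, which is why the paper’s coarse above/below split was so unsatisfying.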

Without that level of detail, I didn’t think that there was enough information to determine whether the experimenters were seeing something real. I was uneasy. The content of the graph did not rule out the claim of fusion. Taleyarkhan’s team may well have seen neutrons that were drop-dead evidence of fusion. But if they did, I couldn’t tell from the graph. If they had confirmatory data, they were not presenting it in a convincing way. If they didn’t know how to convince other scientists of their claims, I suspected that they didn’t know enough about the field to make such claims in the first place.

Calling for other scientists to repeat an experiment before publication was an extremely unusual step, and it likely struck Taleyarkhan as a vote of no confidence, but Oak Ridge insisted. The lab seemed determined to avoid becoming the center of another cold-fusion fiasco.

When Shapira and Saltmarsh analyzed the data they had gathered, the results were damning. They found no sign of fusion, no evidence for neutron emission from the bubbling deuterated acetone. They did not try to verify Taleyarkhan’s findings of tritium, but noted that if the tritium had been produced by fusion, the bubbling solution should have produced a million neutrons per second, and that level of activity should easily have been picked up by the neutron detector. According to their equipment, though, nothing was happening in the bubbling liquid, just the expected number of chirps caused by stray neutrons produced by cosmic rays and the like.

Bubble fusion started out at the core of establishment science, but it ended as sordidly as the cold-fusion fiasco had. The scientific community moved quickly from mere skepticism to accusations of fraud and scientific misconduct, and bubble fusion, like cold fusion, imploded under those charges. Though both methods still have their supporters, both have now been swept to the fringes of science.

Hot fusion now enjoys a monopoly. Mainstream scientists who hope for fusion energy almost unanimously pin their hopes upon inertial confinement fusion or magnetic fusion.

ITER’s trouble began at birth. Nobody had ever pulled off an international scientific project of such an enormous scale. Figuring out how to compress and ignite a plasma was only one of the problems that ITER proponents had to solve. Perhaps even trickier was the problem of distribution and containment of pork. Politicians like to see direct benefits from the money they spend. This means they want cash to flow into the hands of the people who elect them. New Mexico congressmen tend to be munificent to Los Alamos; California senators back Livermore; New Jersey politicians support Princeton. It’s similar in other countries. ITER provided a porky dilemma. No matter where the ITER partners put the reactor, three of the four parties were going to have to spend their money on a machine in another country. Even if these partners managed to build much of the equipment domestically, cash (and talent) would have to flow overseas. This isn’t good pork-barrel politics. The country where the reactor would be built would get the lion’s share of the benefits of the project, and the others would see their money flow into the hands of a rival.

Rather than consolidating multiple international efforts into one big project, the need to distribute the pork among the parties led to just the opposite: duplication of effort. There were three centers, one in Germany, one in Japan, and one in the United States, devoted to designing the reactor. Researchers who believed the tokamak shouldn’t be the only game in town opposed ITER; they didn’t want to wager everything on a single enormous tokamak. Moreover, they weren’t alone in their wariness of the international reactor. Even tokamak physicists felt threatened, because the domestic fusion program would have to be gutted in favor of the enormous international collaboration.

By 1995, the magnetic fusion budget had been hovering around $350 million per year. The President’s Committee of Advisors on Science and Technology (PCAST), an independent panel of experts that counseled the president on all matters scientific, gave Bill Clinton a grave warning about the fusion budget: at $320 million per year, the domestic program would be crippled, and ITER, as planned, would be too expensive to support.

A demonstration fusion power plant would be at least forty years away.

The committee tried to envision a worthwhile fusion program with lower levels of funding but came to the following conclusion: We find that this cannot be done. Reducing the U.S. fusion R&D program to such a level would leave room for nothing beyond the core program of theory and medium-scale experiments … no contribution to an international ignition experiment or materials test facility, no [new domestic tokamak], little exploitation of the remaining scientific potential of TFTR, and little sense of progress toward a fusion energy goal. With complete U.S. withdrawal, international fusion collaboration might well collapse, to the great detriment of the prospects for commercializing fusion energy as well as the prospects for future U.S. participation in major scientific and technological collaborations of other kinds.

When Congress passed the 1996 budget, magnetic fusion got about $240 million. It did not take long for things to unravel completely. In the meantime, the projected costs for ITER were skyrocketing, and scientists raised new doubts about whether it would achieve ignition at all. Some physicists predicted that new instabilities would cool the plasma faster than expected, meaning ITER would fail, just as generations of fusion machines had failed before it.

Physicists began to argue that domestic devices could fail just as well at half the price. The American scientists (as well as their Japanese counterparts, who were also cash strapped) started talking about scaling ITER back, making it into a less-ambitious experiment at a lower cost. ITER-Lite, as the plan was known, would only cost $5 billion. However, ITER-Lite would be unable to achieve ignition and sustained burn. It would be just another incremental improvement on existing devices. Though ITER-Lite was cheaper, it would defeat the main benefit of pooling four countries’ resources. No longer would the countries be leapfrogging over what domestic programs had been able to accomplish on their own. ITER-Lite would not be a great advance over previous designs. It would just be a massive, more expensive version of what everyone else had already built.

Laser fusion scientists didn’t suffer nearly as much in the 1990s as their magnetic fusion counterparts. As magnetic fusion budgets sank, laser fusion ones rose, because laser fusion scientists had a secret weapon: nuclear bombs. They weren’t really going after unlimited energy: they were pursuing laser fusion as a matter of national security. Without a working laser fusion facility, they argued, America’s nuclear weapons arsenal would be in grave danger. Congress was sold. Of course, nuclear testing was the way weapons designers evaluated their new warheads; no nuclear testing means no new types of nuclear warheads, more or less.

It is certain that any sizable design change wouldn’t be considered reliable until it was subjected to a full-scale nuclear test. It’s not a huge problem if the United States can’t design new nuclear weapons; the ones on hand are sufficient for national security.

Without periodic nuclear testing, weaponeers argued, they could not be certain that the weapons in the nuclear stockpile would work. Nuclear bombs, like any other machines, decay over time. Their parts age and deteriorate. Since nuclear weapons use exotic radioactive materials, which undergo nuclear decay as well as physical decay, engineers don’t have a firm understanding of how such a device ages.

Weapons scientists assured federal officials that with a set of high-tech experimental facilities they could ensure the reliability of the nation’s arsenal. Some facilities would concentrate on the chemical explosives that set off the devices. Some would study how elements like plutonium and uranium respond to shocks. But the jewel in the stockpile stewardship’s crown would be NIF, the National Ignition Facility at the Lawrence Livermore National Laboratory.

As late as June 1999, NIF managers swore to the Department of Energy that everything was peachy, that the project, which was scheduled to be finished in 2003, was on budget and on schedule. This was a lie. Within a few months, officials at Livermore had to admit to enormous problems and cost overruns. To have any hope of achieving ignition, NIF’s target pellets, about a millimeter in size, cannot have bumps bigger than fifty nanometers high.

It’s a tough task to manufacture such an object and fill it with fuel. Plastics, such as polystyrene, are relatively easy to produce with the required smoothness, but they don’t implode very well when struck with light. Beryllium metal implodes nicely, but it’s hard to make a metal sphere with the required smoothness.

NIF wouldn’t be terribly useful for stockpile stewardship without achieving breakeven. And NIF’s contribution to stockpile stewardship is crucial for… what, exactly? It’s hard to say for sure. Assume that NIF achieves ignition. For a brief moment, it compresses, confines, and heats a plasma so that it fuses, the fusion reaction spreads, and it produces more energy than it consumes. How does that translate into assuring the integrity of America’s nuclear stockpile?

At first glance, it is not obvious how it would contribute at all. Most of the problems with aging weapons involve the decay of the plutonium “pits” that start the reaction going. Will the pits work? Are they safe? Can you remanufacture old pits or must you rebuild them from scratch? These issues are relevant only to a bomb’s primary stage, the stage powered by fission, not fusion (except for the slight boost given by the injection of a little fusion fuel at the center of the bomb). The fusion happens in the bomb’s secondary stage, and there doesn’t seem to be nearly as much concern about aging problems with a bomb’s secondary. If the primary is where most of the problems are, what good does it do to study fusion reactions at NIF? NIF’s results would seem to apply mostly to the secondary, not the primary.

NIF would help a bit with understanding what happens when tritium in a primary’s booster decays. (However, since tritium has a half-life of only twelve years, it stands to reason that weapons designers periodically must replace old tritium in weapons with fresh tritium. This is probably routine by now.)
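The arithmetic behind that replacement schedule is straightforward exponential decay. A minimal sketch (my own illustration, assuming the commonly cited 12.3-year half-life for tritium):

```python
# Exponential decay of a tritium boost-gas charge (half-life ~12.3 years).
# The fraction remaining after t years is 0.5 ** (t / half_life).
half_life = 12.3  # years

for years in (5, 10, 12.3, 20):
    remaining = 0.5 ** (years / half_life)
    print(f"after {years:>4} years: {remaining:.0%} of the tritium remains")
```

After one half-life exactly half the charge is gone, and a quarter of it has already decayed within the first five years or so, which is why periodic replenishment has to be routine.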

NIF will also help scientists understand the underlying physics and “benchmark” the computer codes-like LASNEX-that simulate imploding and fusing plasma. (But why is this important if you are not designing new weapons? The ones in the stockpile already presumably work just fine, so you presumably don’t need a finer understanding of plasma physics to maintain them.)

“NIF will contribute to training and retaining expertise in weapons science and engineering, thereby permitting responsible stewardship without further underground tests.” That’s the main reason for NIF. NIF is essentially a training ground for weapons scientists. As old ones retire and new ones grow up without ever having seen a nuclear test, NIF is a way to give them some level of experience so that America doesn’t lose its nuclear expertise.

NIF isn’t truly about energy. It is not about keeping our stockpile safe, at least not directly. It is about keeping the United States’ weapons community going in the absence of nuclear tests.

We see what we want to see. That is why science was invented. Science is little more than a method of tearing away notions that are not supported by cold, hard data. It forces us to discard ideas that we cherish. It eliminates some of our hopes, some of our dreams, and some of our wishes. This is why science can be so soul crushing to even its most devoted adherents.

NIF’s design, particularly its slow lasers that need to cool for hours between shots, suggests that researchers will have to move to an entirely different type of laser system to have any hope of a practical energy source. ITER will never achieve ignition and sustained burn, the hallmark of a successful magnetic fusion reactor.

Even with mass production, each fusion power plant will probably cost many billions of dollars.

Fission plants are expensive, but they are likely to be considerably cheaper than their fusion counterparts. Fission plants are more dangerous than fusion plants (the fission reaction can get out of control, and a fusion reaction almost certainly won’t), and malefactors can process spent fuel rods to get materials for atom bombs.
