Preface. This book has been on my reading list for several years now, but I have to admit I am lazy – at 656 pages, it is twice the length of most books. And it would be hard to write a better review than this one…
It is sheer luck that WWIII or nuclear explosions haven’t happened yet by accident: by miscalculation of the other side’s intentions, bombs dropped by mistake, bombers crashing, or computers miscalculating. Between 1950 and 1968 alone, at least 1,200 nuclear weapons were involved in significant accidents.
I heard McNamara speak at U.C. Berkeley after the movie “Fog of War” by Errol Morris was shown. He said it’s up to us to do what we can to stop nuclear proliferation, and indeed it seems as important as any other cause you might choose to get involved in, especially since it has more potential than climate change to drive humans and other life extinct.
Alice Friedemann, www.energyskeptic.com, author of “When Trucks Stop Running: Energy and the Future of Transportation” (2015, Springer) and “Crunch! Whole Grain Artisan Chips and Crackers”. Podcasts: Practical Prepping, KunstlerCast 253, KunstlerCast 278, Peak Prosperity, XX2 report
***
Louis Menand. September 30, 2013. Nukes of Hazard: Eric Schlosser’s “Command and Control.” The New Yorker.
On January 25, 1995, at 9:28 a.m. Moscow time, an aide handed a briefcase to Boris Yeltsin, the President of Russia. A small light near the handle was on, and inside was a screen displaying information indicating that a missile had been launched four minutes earlier from somewhere in the vicinity of the Norwegian Sea, and that it appeared to be headed toward Moscow. Below the screen was a row of buttons. This was the Russian “nuclear football.” By pressing the buttons, Yeltsin could launch an immediate nuclear strike against targets around the world. Russian nuclear missiles, submarines, and bombers were on full alert. Yeltsin had forty-seven hundred nuclear warheads ready to go.
The Chief of the General Staff, General Mikhail Kolesnikov, had a football, too, and he was monitoring the flight of the missile. Radar showed that stages of the rocket were falling away as it ascended, which suggested that it was an intermediate-range missile similar to the Pershing II, the missile deployed by NATO across Western Europe. The launch site was also in the most likely corridor for an attack on Moscow by American submarines. Kolesnikov was put on a hot line with Yeltsin, whose prerogative it was to launch a nuclear response. Yeltsin had less than six minutes to make a decision.
The Cold War had been over for four years. Mikhail Gorbachev had resigned on December 25, 1991, and had handed over the football and the launch codes to Yeltsin. The next day, the Soviet Union voted itself out of existence. By 1995, though, Yeltsin’s popularity in the West was in decline; there was tension over plans to expand NATO; and Russia was bogged down in a war in Chechnya. In the context of nuclear war, these were minor troubles, but there was also the fact, very much alive in Russian memory, that seven and a half years earlier, in May, 1987, a slightly kooky eighteen-year-old German named Mathias Rust had flown a rented Cessna, an airplane about the size of a Piper Cub, from Helsinki to Moscow and landed it a hundred yards from Red Square. The humiliation had led to a mini-purge of the air-defense leadership. Those people did not want to get burned twice.
After tracking the flight for several minutes, the Russians concluded that its trajectory would not take the missile into Russian territory. The briefcases were closed. It turned out that Yeltsin and his generals had been watching a weather rocket launched from Norway to study the aurora borealis. Peter Pry, who reported the story in his book “War Scare” (1999), called it “the single most dangerous moment of the nuclear missile age.” Whether it was the most dangerous moment or not, the weather-rocket scare was one of hundreds of incidents after 1945 when accident, miscommunication, human error, mechanical malfunction, or some combination of glitches nearly resulted in the detonation of nuclear weapons.
During the Cold War, there were a few occasions, such as the Cuban missile crisis, in 1962, when one side or the other was close to a decision that was likely to start a nuclear war. There were also some threats to go nuclear, though they were rarely taken completely seriously. In 1948, during a dispute with the Soviets over control of Berlin, Harry Truman sent B-29s to England, where they would be in range of Moscow. They were not armed with atomic bombs, but they were intended as a signal that the United States would use atomic weapons to defend Western Europe.
In 1956, during the Suez crisis, Nikita Khrushchev threatened to attack London and Paris with missiles if Britain and France did not withdraw their forces from Egypt. And, in 1969, Richard Nixon ordered B-52s armed with hydrogen bombs to fly routes up and down the coast of the Soviet Union—part of his “madman theory,” a strategy intended to get the North Vietnamese to believe that he was capable of anything, and to negotiate for peace. (The madman strategy was no more effective than anything else the United States tried, short of withdrawal, in the hope of bringing an end to the Vietnam War.)
But most of the danger that human beings faced from nuclear weapons after the destruction of Hiroshima and Nagasaki had to do with inadvertence—with bombs dropped by mistake, bombers catching on fire or crashing, missiles exploding, and computers miscalculating and people jumping to the wrong conclusion. On most days, the probability of a nuclear explosion happening by accident was far greater than the probability that someone would deliberately start a war.
In the early years of the Cold War, many of these accidents involved airplanes. In 1958, for example, a B-47 bomber carrying a Mark 36 hydrogen bomb, one of the most powerful weapons in the American arsenal, caught fire while taxiing on a runway at an airbase in Morocco. The plane split in two, the base was evacuated, and the fire burned for two and a half hours. But the explosives in the warhead didn’t detonate; that would have set off a chain reaction. Although the King of Morocco was informed, the accident was otherwise kept a secret.
Six weeks later, a Mark 6 landed in the back yard of a house in Mars Bluff, South Carolina. It had fallen when a crewman mistakenly grabbed the manual bomb-release lever. The nuclear core had not been inserted, but the explosives detonated, killing a lot of chickens, sending members of the family to the hospital, and leaving a thirty-five-foot crater. Although it was impossible to keep that event a secret, the Strategic Air Command (SAC), which controlled the airborne nuclear arsenal, informed the public that the incident was the first of its kind. In fact, the previous year, a hydrogen bomb, also without a core, had been accidentally released near Albuquerque and exploded on impact.
Soon after the successful Soviet launch of Sputnik, in 1957, missiles became the preferred delivery vehicle for nuclear warheads, but scary things kept happening. In 1960, the computer at the North American Air Defense Command (NORAD) in Colorado Springs warned, with 99.9-per-cent certainty, that the Soviets had just launched a full-scale missile attack against North America. The warheads would land within minutes. When it was learned that Khrushchev was in New York City, at the United Nations, and when no missiles landed, officials concluded that the warning was a false alarm. They later discovered that the Ballistic Missile Early Warning System at Thule Airbase, in Greenland, had interpreted the moon rising over Norway as a missile attack from Siberia.
In 1979, NORAD’s computer again warned of an all-out Soviet attack. Bombers were manned, missiles were placed on alert, and air-traffic controllers notified commercial aircraft that they might soon be ordered to land. An investigation revealed that a technician had mistakenly put a war-games tape, intended as part of a training exercise, into the computer. A year later, it happened a third time: Zbigniew Brzezinski, the national-security adviser, was called at home at two-thirty in the morning and informed that two hundred and twenty missiles were on their way toward the United States. That false alarm was the fault of a defective computer chip that cost forty-six cents.
A study run by Sandia National Laboratories, which oversees the production and security of American nuclear-weapons systems, discovered that between 1950 and 1968 at least twelve hundred nuclear weapons had been involved in “significant” accidents. Even bombs that worked didn’t work quite as planned. In Little Boy, the bomb dropped on Hiroshima on August 6, 1945, only 1.38 per cent of the nuclear core, less than a kilogram of uranium, fissioned (although the bomb killed eighty thousand people). The bomb dropped on Nagasaki, three days later, was a mile off target (and killed forty thousand people). A test of the hydrogen bomb in the Bikini atoll, in 1954, produced a yield of fifteen megatons, three times as great as scientists had predicted, and spread lethal radioactive fallout over hundreds of square miles in the Pacific, some of it affecting American observers miles away from the blast site.
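A quick back-of-the-envelope check of that fission figure, sketched below, assuming Little Boy’s core held roughly 64 kilograms of enriched uranium (the core mass is an assumption drawn from commonly cited estimates; the review itself gives only the percentage):

```python
# Rough check of the Little Boy fission fraction quoted above.
# Assumption: the core held ~64 kg of enriched uranium (a commonly cited estimate).
core_mass_kg = 64.0
fission_fraction = 0.0138  # 1.38 per cent, per the review

fissioned_kg = core_mass_kg * fission_fraction
print(f"Uranium that actually fissioned: ~{fissioned_kg:.2f} kg")
# ~0.88 kg, i.e. "less than a kilogram," consistent with the review's figure.
```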
These stories, and many more, can be found in Eric Schlosser’s “Command and Control” (Penguin), an excellent journalistic investigation of the efforts made since the first atomic bomb was exploded, outside Alamogordo, New Mexico, on July 16, 1945, to put some kind of harness on nuclear weaponry. By a miracle of information management, Schlosser has synthesized a huge archive of material, including government reports, scientific papers, and a substantial historical and polemical literature on nukes, and transformed it into a crisp narrative covering more than fifty years of scientific and political change. And he has interwoven that narrative with a hair-raising, minute-by-minute account of an accident at a Titan II missile silo in Arkansas, in 1980, which he renders in the manner of a techno-thriller:
Plumb watched the nine-pound socket slip through the narrow gap between the platform and the missile, fall about seventy feet, hit the thrust mount, and then ricochet off the Titan II. It seemed to happen in slow motion. A moment later, fuel sprayed from a hole in the missile like water from a garden hose.
“Oh man,” Plumb thought. “This is not good.”
“Command and Control” is how nonfiction should be written.
Schlosser is known for two popular books, “Fast Food Nation,” published in 2001, and “Reefer Madness,” an investigative report on black markets in marijuana, pornography, and illegal immigrants that came out in 2003. Readers of those books, and of Schlosser’s occasional writings in The Nation, are likely to associate him with progressive politics. They may be surprised to learn that, insofar as “Command and Control” has any heroes, those heroes are Curtis LeMay, Robert McNamara, and Ronald Reagan (plus an Air Force sergeant named Jeff Kennedy, who was involved in responding to the wounded missile in the Arkansas silo). Those men understood the risks of just having these things on the planet, and they tried to keep them from blowing up in our faces.
Until the late nineteen-sixties, nuclear rhetoric was far ahead of nuclear reality. In 1947, two years after the war in Europe ended, the United States had a hundred thousand troops stationed in Germany, and the Soviet Union had 1.2 million. Truman saw the atomic bomb as a great equalizer (the Soviets had not yet developed one), and he allowed Stalin to understand that the United States would use it to stop Soviet aggression in Western Europe. Truman was subsequently startled to find out from the head of the Atomic Energy Commission, David Lilienthal, that the United States had exactly one atomic bomb in its stockpile. The bomb was unassembled, but Lilienthal thought that it could probably be made operative.
It was during the Eisenhower Administration that nuclear weapons became the centerpiece of American military planning. Eisenhower thought that the defense budget was out of control, and building nuclear bombs was cheaper than maintaining a large conventional armed force. His Administration also believed that the doctrine of “massive retaliation”—the promise to meet Soviet aggression with an overwhelming nuclear response—was a deterrent that would keep the peace.
When John F. Kennedy ran for President, in 1960, he charged the Eisenhower Administration with having permitted a “missile gap” to develop between the United States and the Soviet Union—an issue that may have helped Kennedy win a very close election. But, as Eisenhower knew from spy-plane reconnaissance, there was no missile gap in the Soviets’ favor. In 1960, the Soviet Union had just four confirmed intercontinental ballistic missiles. And although Air Force intelligence informed Kennedy, after he took office, that the Soviets might have a thousand ICBMs by the middle of 1961, by the end of that year they had sixteen. In 1962, the Soviet Union had about thirty-three hundred nuclear weapons in its arsenal, and the United States had more than twenty-seven thousand. The Soviets had 36 ICBMs; the Americans had 203.
Soviet nuclear capability was regularly exaggerated by American intelligence in the 1950s, and it was in the interest of the armed services, and particularly the Air Force (not a hero in Schlosser’s story), not to correct the record. For more than ten years, the American government poured money into the manufacture of nuclear weapons, the American public was regularly frightened by warnings about the dangers of a nuclear attack that was always made to appear imminent, and defense intellectuals produced papers and books in which they thought about the unthinkable—how to prepare for, how to avoid, and how to survive a nuclear war.
The threat was largely, although not completely, imaginary. The Soviets didn’t have the capability that nuclear-war scenarios assumed, and there was no good reason to believe that anyone’s nuclear weapons would work the way they were designed to. The Kennedy Administration estimated that seventy-five per cent of the warheads on Polaris missiles (the missiles carried in submarines) would not detonate.
Even the war plans were flawed. An atomic explosion kills by shock waves, by radioactive fallout, and by fire. But, as Lynn Eden explained in “Whole World on Fire” (2004), American military planners never took fire into account when they made estimates of bomb damage. They therefore systematically underestimated the projected effects of nuclear bombing, and that led to the production of far more warheads than anyone needed.
But the threat, even though partly imagined, permitted the military to compile an arsenal that forced the Soviets to compile an arsenal to match it—and thereby to make the threat real. By the early 1970s, the Soviet Union had more long-range missiles than the United States did. By then, the public was no longer transfixed by the spectacle of imminent nuclear war, but the world was a far more dangerous place than it had been in the years of civil-defense exercises and back-yard fallout shelters.
Schlosser’s story brings out the pas-de-deux character of Cold War relations, the habit each side had of copying whatever move the other side had just made. Every strategic advantage was answered with its double. The reason the United States wanted nuclear superiority was not to knock out the Soviet Union but to keep the peace: it wanted the Soviet Union to know that if it ever started a nuclear war it would lose. The Soviets, unsurprisingly, saw the matter differently, so, every time the United States did something that gave it an edge, the Soviets responded, and the edge vanished. The search for stability was inherently destabilizing.
When the United States, in the 1950s, cut back on conventional forces in order to rely on nukes, for example, the Soviets did the same. The Warsaw Pact was the Soviet version of NATO. After the United States created the Strategic Air Command and made it the spearhead of the country’s military power, the Soviets created the Strategic Rocket Forces. When the United States developed the capacity to survive a first strike, the Soviets did the same. The monkeys chased each other up the tree.
The pattern was true even of Cold War domestic policy. In 1947, Truman created, by executive order, a loyalty program for federal employees. A week later, the Central Committee of the Communist Party established the Soviet honor courts, charged with investigating Western influences on Soviet life. The House Un-American Activities Committee began investigating Communists in Hollywood at the same time that Stalin and his cultural commissar, Andrei Zhdanov, started cracking down on artists and writers.
Every move intended to prevent a deliberate nuclear war therefore ended up increasing the risk of an accidental one. Schlosser’s point is not that there was some better way to run a Cold War. It is that the more extensive, elaborate, and fine-tuned the nuclear-weapons system became, the greater its exposure to the effects of an accident. For the system to work—for the warnings to be timely, communications to be transparent, missiles to launch, explosives inside the warheads to detonate, and nuclear cores to fission—everything has to be virtually perfect. The margin for error is tiny. And nothing is perfect.
Schlosser cites Charles Perrow’s “Normal Accidents” (1984) as an inspiration for his book. Perrow argued that in systems characterized by complex interactions and by what he called “tight coupling”—that is, processes that cannot readily be modified or turned off—accidents are normal. They can be expected. And they don’t lend themselves to very satisfying postmortems, since it is often difficult to explain just what stage it was in the cascade of bad events that made them irreversible.
Who was at fault in the Norwegian weather-rocket scare? The Norwegians had, in fact, notified the Russians several weeks in advance of the launch. They hadn’t specified a day, because the launch would depend on weather conditions. Either that notice was sent to the wrong parties in Russia or (which seems more likely) whoever received the notice didn’t grasp the implications or simply forgot to forward it to military authorities.
A mis-sent message is one of the most common errors in the world. Schlosser reminds us that during the Cuban missile crisis messages to Moscow from the Soviet Ambassador in Washington were written by hand and given to a Western Union messenger on a bicycle. “We at the Embassy could only pray,” the Ambassador, Anatoly Dobrynin, later said, “that he would take it to the Western Union office without delay and not stop to chat on the way with some girl.” (It was because of this that, after the crisis was over, the hot line linking the White House and the Kremlin was installed.)
And so, for six minutes in 1995, the future of the species hung in the balance because a mid-level Russian official left work early, or neglected to find a proper procedure for dealing with a message that someone was sending up a rocket, at an unspecified time, to look at the northern lights. It’s like the forty-six-cent computer chip. There was no redundancy built into the system. If one piece failed, the whole system was imperiled.
The Arkansas incident, in 1980, is well chosen as an illustration of Schlosser’s point. Objects fall inside silos all the time, he says. The chance that a falling socket would puncture the skin of a Titan II missile was extremely remote—but not impossible. When it happened, it triggered a set of mechanical and human responses that quickly led to a nightmare of confusion and misdirection. Once enough oxidizer leaked out and the air pressure inside the tank dropped, the missile would collapse, the remaining oxidizer would come into contact with the rocket fuel, and the missile would explode. Because a nineteen-year-old airman performing regular maintenance accidentally let a socket slip out of his wrench, a Titan II missile became a time bomb, and there was no way to turn off the timer.
And the missile was armed. Schlosser says that the explosive force of the warhead on a Titan II is nine megatons, which is three times the force of all the bombs dropped in the Second World War, including the atomic bombs that destroyed Hiroshima and Nagasaki. If it had detonated, most of the state of Arkansas would have been wiped out.
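A similar rough check of that comparison, assuming an outside figure the review does not state: roughly three megatons of TNT equivalent for all the bombs dropped in the Second World War, a commonly cited estimate.

```python
# Rough check of the Titan II warhead comparison quoted above.
# Assumption: total WWII bombing, including Hiroshima and Nagasaki,
# amounted to roughly 3 megatons of TNT equivalent (a commonly cited estimate).
titan_ii_yield_mt = 9.0
ww2_total_mt = 3.0

print(f"Titan II warhead vs. all WWII bombs: about {titan_ii_yield_mt / ww2_total_mt:.0f}x")
# About 3x, matching the review's "three times the force of all the bombs
# dropped in the Second World War."
```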
Few systems are more tightly coupled than the arsenal controlled by the nuclear football. Once the launch codes are entered, a chain of events is set in motion that is almost impossible to interrupt. The “Dr. Strangelove” scenario is quite realistic. The American nuclear-war plan, known as the Single Integrated Operational Plan (SIOP), provided for only one kind of response to an attack: full-scale nuclear war. It was assumed that tens of millions of people would die. There were no post-attack plans. For forty years, this was the American nuclear option. No doubt, the Soviets’ was identical.
Henry Kissinger called the SIOP a “horror strategy.” Even Nixon was appalled by it. Schlosser says that when General George Butler became the head of the Strategic Air Command, in 1991, and read the SIOP he was stunned. “This was the single most absurd and irresponsible document I had ever reviewed in my life,” he told Schlosser. “I came to fully appreciate the truth. . . . We escaped the Cold War without a nuclear holocaust by some combination of skill, luck, and divine intervention, and I suspect the latter in greatest proportion.”
The dangerous people in Schlosser’s story are the people who try to enhance the readiness of nuclear weaponry by reducing the controls on its use. The good people are not the anti-nuke activists. Schlosser is quite dismissive of them, especially the Western Europeans who protested against the Pershing IIs intended to protect them but not against the Soviet missiles right across the border that were aimed at them night and day.
Schlosser’s good people bring order to the system of nuclear armaments or try to find means of limiting its potential effects. When Curtis LeMay became the head of SAC, in 1948, the United States was already committed to an announced policy of resisting Communist aggression anywhere in the world—the Truman Doctrine—and to using the threat of atomic weapons as a deterrent. But LeMay found SAC to be a lax, undisciplined, and underequipped organization. Training was poor and security measures were almost nonexistent.
LeMay had commanded a bomber group in the Second World War, flying in the lead plane, and his toughness was legendary. He thought the term “limited war” was an oxymoron. His theory of war was that if you kill enough people on the other side they will stop fighting. He fired the top officers at SAC and instituted a rigid system of rules and procedures, checklists and practice runs, and turned SAC into a model of efficiency. Schlosser suggests that these reforms saved many lives.
Schlosser notes with some regret that LeMay became a symbol of military buffoonery after George C. Scott portrayed him as General Buck Turgidson, in “Dr. Strangelove,” and that he then made a mistake by running for Vice-President, in 1968, on a ticket with the segregationist George Wallace. At a press conference, LeMay declined to rule out the use of nuclear weapons in Vietnam. This position was consistent with his view that war must always be all-out, and, a year later, Nixon sent a signal that he was willing to use hydrogen bombs against the North Vietnamese. But Americans had lost their tolerance for nuclear brinkmanship. This was Strangelove talk.
Schlosser thinks that although Robert McNamara, too, had become one of the most despised figures in American politics by the time he resigned as Lyndon Johnson’s Secretary of Defense, in 1968, he had worked hard to limit the use of nuclear weapons. He had improved American early-warning systems; he had tried, with minimal success, to revise the SIOP; and he had worked to have the Soviets understand that the United States would attack only military targets, encouraging them to do the same. But Vietnam brought him down.
Schlosser is careful not to give Ronald Reagan too much credit for defusing the arms race. He thinks that Reagan’s offer to eliminate all nuclear weapons during his famous summit meeting with Gorbachev in Reykjavik, in 1986, was partly a response to changes in American public opinion regarding nukes. But he also thinks that, although Reagan’s offer went nowhere (because he refused to cancel the Strategic Defense Initiative, the anti-missile system known as Star Wars), Reykjavik was “a turning point in the Cold War.” It convinced Gorbachev that the United States would not attack the Soviet Union, which enabled him to pursue his reform agenda, and eventually led to the removal of all intermediate-range missiles from Western Europe.
David Holloway, a historian of the period, once raised the question whether the nuclear arms race was a product of the Cold War or a cause. The bomb is inextricable from Cold War history because it was present at the very start. Truman’s principal reason for deciding to drop the bomb on Japan was to bring the war in the Pacific to a quick end, but his secondary one was to erect a psychological obstacle to any Soviet plans for postwar expansion. He wanted the Soviets to understand that the United States had no qualms about answering aggression with atomic weapons. (Ending the war quickly was itself a way to prevent the Soviets from acquiring territory in the Pacific while fighting was under way there, and then colonizing it, as they did in Eastern Europe.)
Cold wars are historically common events. They are just ways of gaining geopolitical advantage without military battles. In the seventeenth century, Louis XIV fought cold wars with his European neighbors and with the papacy. What made the American Cold War different was not the bomb itself but the idea of the bomb, the bomb as the symbol of ultimate commitment. That idea is what locked the East-West antagonism into place, and raised the stakes in every disagreement. The bomb may have prevented military conflict between the superpowers; it did not prevent the many superpower proxy wars—in Korea, Vietnam, Nicaragua, Afghanistan—in which millions of people died. In the end, the Soviet Union gave up, something that no one had predicted. But today many smaller powers have nuclear weapons, and even in the unlikely event that no leader of one of those nations ever decides to use them, out of fear or anger, there is always the possibility—in the long run, there is the inevitability—of an accident.