History News Network

On the Eve of Destruction: Breaking the Double-Bind of the Nuclear Arms Race

EVER SINCE THE DEVELOPMENT OF THE ATOMIC BOMB in the mid-1940s, the governing consensus among American liberals and conservatives alike has been that nuclear weapons are an essential part of military preparedness. Whether the Republican or the Democratic Party held power in Washington, the underlying assumption remained the same: a “balance of terror” was the only way to ensure peace. From Presidents Truman and Eisenhower through Reagan and Obama, there was always a bipartisan Congressional majority in support of billions of dollars for nuclear weaponry, without the usual “how are you going to pay for it?” posturing.

Yet the physicists who built or supported the building of the first nuclear weapons believed that the bomb changed everything, including the very nature of war. The great Danish physicist Niels Bohr put it most succinctly: “We are in a completely new situation,” he told a friend in 1948, “that cannot be resolved by war.” Bohr tried to convince President Franklin D. Roosevelt and British Prime Minister Winston Churchill of that millennial change in 1944, a year before the bombs were ready, hoping the two leaders would share the secret of its ongoing development with Joseph Stalin and negotiate together for its postwar control. Churchill, strained at the time by preparations for the Normandy invasion, was also aware that the bomb would be a fundamental measure of national prestige, something he believed his nearly bankrupt nation would need. He dismissed Bohr impatiently: “After all,” he argued, “this new bomb is just going to be bigger than our present bombs; it involves no difference in the principles of war.”

The destructive change of scale was not obvious at the outset. The first bombs were small—mere tactical weapons, battlefield weapons—by today’s standards: fifteen thousand tons of TNT equivalent, or fifteen kilotons, for Hiroshima; around twenty-two kilotons for Nagasaki. The American physicist Philip Morrison, assigned as part of a Red Cross team to assess the damage of those first atomic bombings, hitched a ride out of Tokyo in a general’s plane in September 1945 and saw that the firebombed cities of Nagoya, Osaka, and Kobe “looked checkered,” with patches of destruction intermingled with undamaged areas, according to an account in The New Yorker in 1946. When they flew down the green length of that narrow mountainous archipelago to Hiroshima they saw “one enormous, flat, red-rust scar” with no roofs or vegetation left. Whether struck by firebombing from hundreds of B-29s or by one B-29 carrying one atomic bomb, all the bombed cities were full of rubble and ashes. Morrison left Hiroshima believing that because the atomic bomb exploded high in the air there “had been a minimum of radioactivity in the city.” The U.S. diplomat Paul Nitze, part of the team preparing the postwar United States Strategic Bombing Survey, viewed the same damage Morrison saw and concluded that atomic bombs were not the decisive weapons the scientists had claimed. “After all, this new bomb is just going to be bigger than our present bombs.” But not so much bigger as to change everything—not yet.

After the war, the scientists went one way, the government another. The scientists campaigned for genuine international control; the government backed a version of international control premised on an implicit U.S. nuclear monopoly, with a UN military force as added insurance. And for a few short years there was something like a strategic balance in the world. The Soviet Union had several million troops on the ground in Eastern Europe and chose to keep them there; the United States had atomic bombs. But the Soviets were unwilling to accept a world where only the United States knew how to make nuclear weapons, and in August 1949, they tested their first bomb. Now they had several million troops on the ground in Europe and the atomic bomb.

Washington panicked. Almost immediately, the debate among government, scientific, and military leaders and advisers concentrated on one question: whether the United States should rush to build a hydrogen bomb, a weapon that would be several orders of magnitude more powerful and more destructive than the U.S. stockpile at that time of around 170 atomic bombs. The scientific advisers, led by the charismatic American theoretical physicist Robert Oppenheimer, recommended against chasing after a weapon they did not yet know how to make, one requiring for its fuel a rare form of hydrogen—tritium—that would have to be bred in a nuclear reactor, with each kilogram of tritium precursor taking up space in the reactor that could otherwise breed enough plutonium for seventy-five atomic bombs. And of course, in the words of the advisory committee’s majority report, there was the brutal indiscretion of the hydrogen bomb, the use of which “carries much further than the atomic bomb itself the policy of exterminating civilian populations.” In a minority report annex, two members of the committee, Nobel laureate physicists Isidor Rabi and Enrico Fermi, went further:

Necessarily such a weapon goes far beyond any military objective and enters the range of very great natural catastrophes. By its very nature it cannot be confined to a military objective but becomes a weapon which in practical effect is almost one of genocide. . . . It is necessarily an evil thing considered in any light.

Read entire article at The Baffler