The Untold Story of the World's Biggest Nuclear Bomb
tags: nuclear weapons, atomic bomb, hydrogen bomb, arms control, nuclear testing, Edward Teller
Alex Wellerstein is an Associate Professor and Director of the Science and Technology Studies program at the Stevens Institute of Technology. His first book, Restricted Data: The History of Nuclear Secrecy in the United States, was published by the University of Chicago Press in April 2021.
Even before the first atomic bomb was built, scientists in the United States had conceived of an even larger weapon, the “Super,” which would use the energy of a fission bomb to power nuclear fusion reactions in the heavy hydrogen isotopes deuterium and tritium—resulting in a much more powerful weapon than one fueled by fission alone. Such a weapon, they reasoned, could be scaled up to the megaton range, a thousand-fold increase over the kiloton weapons they were contemplating for World War II. Los Alamos researchers were doing calculations on theoretical fission-ignited fusion bombs with yields of 100 megatons by October 1944.
But making the first hydrogen bombs took a bit more time than that. Post-war attempts to rein in the arms race failed, and the Soviet Union detonated its first atomic bomb in 1949. By the end of that year, a tense debate over whether a crash H-bomb program was the proper response to the loss of the American nuclear monopoly had leaked into the public, giving rise to speculation about the vast damage that could be caused by still-hypothetical megaton weapons. It was easy to apply scaling laws to see what the damage would be from such weapons. The 20-kiloton “Fat Man” bomb used against Nagasaki, for example, could devastate the downtown area of a large American city like San Francisco, Los Angeles, or New York. A single 10-megaton bomb, though, could destroy entire metro areas, subjecting over a thousand square miles to a crushing blast wave and searing heat, easily producing casualties in the millions. The radioactivity produced would also be multiplied many hundreds of times, creating the possibility of vast contamination.
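The scaling described above follows from a standard rule of thumb: blast damage radii grow roughly with the cube root of yield, so damaged area grows with the two-thirds power. A minimal sketch of that arithmetic (the cube-root rule is a well-known approximation for blast effects; the specific function and numbers here are illustrative, not from the article):

```python
# Rule-of-thumb scaling of nuclear blast effects: damage radius
# grows roughly as the cube root of yield. Illustrative only.
def blast_radius_factor(yield_kt, reference_kt=20.0):
    """Factor by which blast radii grow relative to a reference yield
    (default reference: the ~20 kt Fat Man bomb)."""
    return (yield_kt / reference_kt) ** (1.0 / 3.0)

# Compare a 10-megaton (10,000 kt) weapon to the 20 kt Fat Man:
factor = blast_radius_factor(10_000)
print(f"Blast radii grow by about {factor:.1f}x")       # ~7.9x
print(f"Blasted area grows by about {factor ** 2:.0f}x")  # ~63x
```

A 500-fold increase in yield thus yields "only" about an eight-fold increase in blast radius, which is why area-of-destruction figures, rather than yield alone, convey the jump from kiloton to megaton weapons.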
By the spring of 1951, Edward Teller and Stanislaw Ulam at Los Alamos had developed their design for a workable hydrogen bomb. The idea was superficially simple: Use the radiation of an exploding fission bomb (the “primary”) to compress a special capsule that contained both fusionable and fissionable materials (the “secondary”). A proof-of-concept device (“Sausage”) was tested in November 1952, achieving an explosive yield of 10 megatons. A more compact, weaponized version (“Shrimp”) was detonated in March 1954 in the Castle Bravo test, achieving a much higher yield than anticipated (15 megatons, or 1,000 times as powerful as the bomb dropped on Hiroshima) and surprising the scientists with more radioactive fallout than expected (which required the evacuation of occupied atolls downwind from the Marshall Islands test site).
Only a few months later, in July 1954, Teller made it clear he thought 15 megatons was child's play. At a secret meeting of the General Advisory Committee of the Atomic Energy Commission, Teller broached, as he put it, "the possibility of much bigger bangs." At his Livermore laboratory, he reported, they were working on two new weapon designs, dubbed Gnomon and Sundial. Gnomon would be 1,000 megatons and would be used like a "primary" to set off Sundial, which would be 10,000 megatons. Most of Teller's testimony remains classified to this day, but other scientists at the meeting recorded, after Teller had left, that they were "shocked" by his proposal. "It would contaminate the Earth," one suggested. Physicist I. I. Rabi, by then an experienced Teller skeptic, suggested it was probably just an "advertising stunt." But he was wrong: Livermore would continue working on Gnomon, at least, for several years, and had even planned to test a prototype of the device in Operation Redwing in 1956 (though the test never took place).
All of which is to say that the idea of making hydrogen bombs in the hundreds-of-megatons yield range was hardly unusual in the late 1950s. If anything, it was tame compared to the gigaton ambitions of one of the H-bomb’s inventors. It is hard to convey the damage of a gigaton bomb, because at such yields many traditional scaling laws do not work (the bomb blows a hole in the atmosphere, essentially). However, a study from 1963 suggested that, if detonated 28 miles (45 kilometers) above the surface of the Earth, a 10,000-megaton weapon could set fires over an area 500 miles (800 kilometers) in diameter. Which is to say, an area about the size of France.
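The France comparison can be checked with simple geometry: a circle 500 miles (800 km) across covers roughly the same area as metropolitan France. A quick sketch of the arithmetic (the ~551,000 km² figure for France is an approximation I am supplying, not from the article):

```python
import math

# Rough check of the 1963 study's figure: fires set over a circle
# 500 miles (800 km) in diameter, compared with the area of France.
diameter_km = 800
fire_area_km2 = math.pi * (diameter_km / 2) ** 2
france_area_km2 = 551_000  # metropolitan France, approximate

print(f"Fire area: {fire_area_km2:,.0f} km^2")  # ~503,000 km^2
print(f"France:    {france_area_km2:,} km^2")
```

The two figures agree to within about ten percent, which is as close as such back-of-the-envelope comparisons get.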