A mushroom cloud is one of the most terrifying images imaginable for the billions of people around the world who still live under the specter of nuclear war, but for many others, it represented the culmination of one of the most consequential research efforts in human history: the Manhattan Project.
The Manhattan Project, the codename for the US effort in World War II to beat Nazi Germany to an atomic weapon, left a complicated and, in many respects, terrible legacy. This is especially true for the people of Japan, the only nation ever to have suffered a nuclear attack.
While many notable scientists involved in the Manhattan Project would experience anguish, guilt, and horror over the consequences of their work, others saw the atomic bomb — and the doctrine of Mutually Assured Destruction (MAD) that accompanied it — as the only way to secure lasting peace in the world.
No matter one's ultimate position on the issue of the atomic bombs that ended the Second World War, there is no question that few research projects have proven as consequential for human history as the Manhattan Project.
Early Atomic Theory
The idea of an indivisible unit of matter goes back to some of the earliest Greek and Indian texts on record, but among the first to describe the concept of the atom were the ancient Greek philosophers Leucippus and Democritus, in the fifth century BCE.
The name atom itself comes from the ancient Greek word atomos, which roughly means "uncuttable", and for more than two millennia, the indivisibility of the atom was more of a philosophical position than a scientific one.
In the Middle Ages, questions about the true nature of matter straddled the line between practical science and alchemy, as European and Islamic scholars picked up where the Greek philosophers left off. The discovery of bismuth is sometimes attributed to the Persian alchemist Jabir ibn Hayyan, and in 1669, the German alchemist Hennig Brand discovered phosphorus while trying to create the philosopher's stone, a legendary substance believed to turn base metals into gold.
Even such scientific luminaries as Sir Isaac Newton could not help but dabble in alchemy, but it wasn't until the 18th and 19th centuries that atomic theory as we know it today was put on a more solid scientific footing.
After scientists like the 18th-century French chemist Antoine-Laurent Lavoisier began to isolate and identify individual elements like oxygen, John Dalton (1766-1844) developed his atomic theory.
Dalton proposed that atoms of different elements differ in size and mass, refuting the long-held notion that the atoms of all kinds of matter are alike. He also formulated a method of estimating the relative masses of the elements from the fixed ratios in which they combine with one another.
Nuclear physics was born when Ernest Rutherford first split the atom in 1917. Then, one of the most consequential discoveries in physics came in December 1938, when chemists Otto Hahn and Fritz Strassmann found that bombarding uranium atoms with neutrons broke them into two smaller nuclei, and physicists Lise Meitner and Otto Frisch worked out what had happened.
The two fragments together were lighter than the original uranium nucleus by about one-fifth the mass of a proton. Using Einstein's equation E = mc², Meitner and Frisch showed that this missing mass had been released as energy, roughly 200 MeV per atom. The physicists had discovered nuclear fission, just as the shadow of war descended over Europe in 1939.
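That back-of-the-envelope calculation is easy to reproduce. Meitner estimated the fragments' mass deficit at roughly one-fifth of a proton's mass; taking the standard proton rest energy of about 938.3 MeV (a reference value, not from the text above):

```python
# Reproducing Meitner and Frisch's back-of-the-envelope fission-energy estimate.
proton_rest_energy_mev = 938.3     # proton rest energy (E = mc^2), in MeV
mass_deficit_fraction = 1 / 5      # ~one-fifth of a proton's mass lost in fission

energy_released_mev = proton_rest_energy_mev * mass_deficit_fraction
print(f"~{energy_released_mev:.0f} MeV")  # ≈ 188 MeV, close to the ~200 MeV observed
```

The rough agreement with the ~200 MeV figure is exactly what convinced Meitner and Frisch that the missing mass really had become energy.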
The Outbreak of World War II
After the discovery of nuclear fission, it was quickly understood that nuclear fission in a large amount of uranium could produce a cascading chain reaction that would release an incredible amount of energy.
The military applications of this were obvious, and it wasn't long before Germany began its Uranverein — or "uranium club". The program began in April 1939, five months before the Nazis invaded Poland and touched off the Second World War.
Once the war began, physicists who had fled Europe feared what the Nazis would do if they were the first to create an atomic bomb. Among them was Albert Einstein, who signed a letter sent to President Franklin D. Roosevelt in August 1939 to alert the US government to Nazi Germany's suspected nuclear program and to plead with the US to begin securing uranium stockpiles around the world.
The letter also implored the government to invest in the research of nuclear fission chain reactions being carried out by Enrico Fermi in the United States.
Roosevelt ordered government officials to follow up on the letter and judge the feasibility of developing weapons based on nuclear fission chain reactions. They came back to affirm that not only was it theoretically possible but that such weapons would be by far the most powerful weapons humanity had ever seen.
Beginning the Manhattan Project
Considering the stakes, the initial investment into what would be known as the Manhattan Project — the US Army office that worked on the project was originally located in Manhattan, NY — was pretty modest.
The US Navy granted Columbia University $6,000 to fund Fermi's research into fission chain reactions. Fermi, along with Leo Szilard, Eugene Booth, John Dunning, and others, used the funding to purchase graphite and subsequently produced the first fission reaction in the US.
Soon, the US government began to increase funding for uranium and plutonium fission research, though still at a relatively small scale.
In the UK, Frisch and Rudolf Peierls identified the amount of uranium required to produce a fission chain reaction, a "critical mass" of about 20 lbs (9.07 kg). This made the amount of material required small enough to fit on a conventional bomber, so when they presented this data to the UK government, it approved its own atomic weapons program.
By late 1941, the UK was further along in its atomic weapons research than the US, so the two countries began exchanging atomic weapons research to help accelerate their development even before the US formally entered the war.
Roosevelt put the US Army in charge of the program, due to that branch's experience with large-scale research efforts, and once the US entered the war in December 1941 after the bombing of Pearl Harbor, investment in the Manhattan Project began in earnest.
US officials tapped University of California theoretical physicist J. Robert Oppenheimer to head up "Project Y", the part of the Manhattan Project that would actually design and build the atomic bomb itself. Oppenheimer had no experience leading the kind of laboratory that the project would need, but many of the other candidates being considered were working on other projects where they couldn't be spared.
He also had some personal connections to known communists, including both his wife and suspected mistress, which raised security concerns. These were assuaged, however, after interviews with Oppenheimer, and he was given the lead on developing the US atomic bomb. (In 1954, Oppenheimer was called before a tribunal of the Atomic Energy Commission for a hearing on his past involvement with communist organizations and the possibility that he was a Soviet spy. As a result of evidence given, his security clearance was revoked, provoking outrage.)
The scale of the project was truly enormous, involving over a dozen sites around the US, with material — especially uranium — sourced from both Canada and the Belgian Congo.
Oppenheimer suggested a site near the Los Alamos Ranch School, near the Sangre de Cristo mountains in New Mexico, as the location of the Project Y laboratory. The site chosen would come to be called the Los Alamos National Laboratory, though it was listed as "Site Y" or "The Hill" in official documents, given the top-secret nature of the project.
Sites Contributing to the Work at Los Alamos Laboratory
For secrecy's sake as much as practicality's, the Manhattan Project was distributed over several locations across the United States and Canada.
One of the most important of these sites was the Clinton Engineer Works in Oak Ridge, Tennessee. Here, the rare uranium isotope needed for the first atomic bomb was separated from natural uranium and concentrated into purer material.
The challenge was that the uranium mined in Canada and the Belgian Congo was about 99.3% uranium-238 and only 0.7% uranium-235. Only uranium-235 can sustain a fission chain reaction, so it had to be separated out, or "enriched", to produce a critical mass for a bomb.
Enriched uranium-235 was produced at two separate plants at Oak Ridge, each using a different enrichment method, and was delivered to the Los Alamos Laboratory by July 1945. Almost all the U-235 produced at Oak Ridge, 64 kg, went into the Little Boy atomic bomb.
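The scale implied by that 0.7% figure is easy to underestimate. As a rough, idealized lower bound (assuming every gram of U-235 in the feed could be recovered, which no real enrichment process achieves), the natural uranium needed to yield Little Boy's 64 kg works out to:

```python
u235_fraction = 0.007       # U-235 abundance in natural uranium (0.7%)
little_boy_u235_kg = 64     # U-235 delivered for Little Boy (per the text)

# Idealized lower bound: assumes perfect recovery of every U-235 atom.
natural_uranium_kg = little_boy_u235_kg / u235_fraction
print(f"at least {natural_uranium_kg:,.0f} kg of natural uranium")  # ≈ 9,143 kg
```

In practice the plants processed far more ore than this, since real separation leaves much of the U-235 behind in the depleted "tails".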
Part of the strategy of the Manhattan Project was not to invest everything in uranium fission. Plutonium-239, which had only just been discovered in late 1940 and early 1941 at the University of California, Berkeley, was also developed as a fissile material for an atomic bomb.
It has a much lower critical mass than uranium-235, but it is virtually nonexistent in nature. Instead, plutonium-239 had to be bred artificially, by bombarding uranium-238 with neutrons.
The uranium-238 would absorb a neutron to become uranium-239, which quickly decays into neptunium-239, which in turn decays into plutonium-239.
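The tail end of this breeding chain can be sketched numerically. The half-lives used below are assumed reference values, not figures from the text: uranium-239 beta-decays with a half-life of roughly 23 minutes, and neptunium-239 with a half-life of roughly 2.36 days, so within days of irradiation most of the neptunium has become plutonium:

```python
# Decay in the U-238 -> U-239 -> Np-239 -> Pu-239 breeding chain.
# Half-lives are assumed reference values, not given in the article.
U239_HALF_LIFE_DAYS = 23.45 / (60 * 24)   # ~23.45 minutes, as a fraction of a day
NP239_HALF_LIFE_DAYS = 2.356

def fraction_decayed(t_days, half_life_days):
    """Fraction of a radioactive sample that has decayed after t_days."""
    return 1 - 0.5 ** (t_days / half_life_days)

# Np-239 is the slow step: after ten days, nearly all of it is plutonium-239.
print(f"{fraction_decayed(10, NP239_HALF_LIFE_DAYS):.1%}")  # ≈ 94.7%
```

This is why freshly irradiated uranium could be chemically processed for plutonium within a week or two of leaving the reactor.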
The easiest way to do this was inside a nuclear reactor, where fissioning uranium-235 in the fuel supplies the neutrons. During the Manhattan Project, several such reactors were built to produce the plutonium needed for a bomb, including the X-10 graphite reactor at the Clinton Engineer Works in Oak Ridge and the production reactors at the Hanford Engineer Works in Washington state.
Since plutonium-239 reaches a critical mass at about 24 lbs (11 kg), roughly a sphere four inches (10.2 cm) in diameter, it takes far less material to create the chain fission reaction needed for a bomb. The trouble was producing and isolating enough of the isotope from the reactors to reach that mass.
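The sphere size quoted above can be sanity-checked from the density of plutonium. Assuming alpha-phase plutonium at about 19.8 g/cm³ (an assumed reference value; the actual cores used a less dense gallium alloy):

```python
import math

mass_g = 24 * 453.6       # 24 lbs in grams
density_g_cm3 = 19.8      # alpha-phase plutonium (assumed reference value)

volume_cm3 = mass_g / density_g_cm3
radius_cm = (3 * volume_cm3 / (4 * math.pi)) ** (1 / 3)
diameter_cm = 2 * radius_cm
print(f"{diameter_cm:.1f} cm ({diameter_cm / 2.54:.1f} in)")  # ≈ 10.2 cm (4.0 in)
```

The result matches the four-inch figure: a bomb's worth of plutonium is about the size of a grapefruit.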
Uranium could be extracted directly from mined ore, but plutonium had to be bred by effectively spending uranium in a reactor, so a careful balance had to be struck to ensure there was enough of both fissile materials for the Manhattan Project's purposes.
Enough plutonium was delivered to Los Alamos by July 1945 to produce two atomic bombs. In total, the Manhattan Project produced enough fissile material by 1945 for just three bombs, and no one was even sure if any of them would work.
Designing the World's Most Powerful Weapon
There were two different designs used to produce an atomic bomb.
The first was what is known as a gun-type bomb. This method involves "shooting" a slug of sub-critical fissile material into a target sub-critical mass of the same fissile material, using a conventional chemical charge as a propellant.
When the slug joins the target mass, the combined mass becomes supercritical, and a burst of neutrons initiates a fission chain reaction that grows exponentially through both the slug and the target.
The key challenge with this type of device is to achieve supercriticality before the bomb destroys itself. The slug must be driven into the target quickly enough that a stray neutron cannot set off a weak, premature reaction (a "fizzle") that blows the masses apart before they have had time to undergo the full supercritical reaction.
This type of bomb is easier to produce, however, so it was the first type of bomb planned at Los Alamos in 1943.
Thin Man, as the bomb was called, used plutonium as its fissile material. However, after Los Alamos Laboratory received its first batch of reactor-bred plutonium-239 from Oak Ridge in 1944, it was found that the material contained too much plutonium-240 to be used in a gun-type bomb.
In a gun-type bomb, the stray neutrons released by plutonium-240's spontaneous fission would start the chain reaction too early, blowing the bomb apart before the rest of the material had fissioned enough to produce the desired explosion.
In order to use the plutonium material, another bomb design had to be developed.
A Los Alamos physicist, Seth Neddermeyer, had already been working on plans for an implosion-type bomb since 1943. Implosion uses spherically arranged explosives to compress a subcritical mass of fissile material into a denser, supercritical mass.
As the atoms of the subcritical mass are squeezed together, neutrons find new nuclei far more quickly, and the supercritical fission chain reaction begins almost instantly.
The problem with an implosion-type bomb is that it is a much more sophisticated design from a theoretical and practical perspective, which made it much harder to actually implement.
This design, however, had the advantage of largely negating the problem of premature plutonium-240 detonation: implosion assembles a supercritical mass in a few microseconds rather than the milliseconds a gun-type design requires, leaving stray neutrons far less time to set the reaction off early.
In late 1943, John von Neumann, a mathematician at Los Alamos Laboratory, improved on Neddermeyer's implosion design. He refined the calculations needed to generate the right implosive detonation wave, and he introduced a spherical plutonium core in place of the cylinder Neddermeyer's design called for.
The spherical design greatly reduced the chance of premature detonations while also increasing the efficiency of the fissile chain reaction by keeping it tightly packed for a longer duration. This increased the number of fission reactions achieved in the plutonium before the energy being released blew the bomb apart.
Satisfied that implosion was the better of the two designs, Oppenheimer redirected Los Alamos' efforts toward a new bomb, called Fat Man, which would use the plutonium the project already had.
Von Neumann's design used explosive "lenses", which are shaped charges that control the geometry of the shockwave passing through them.
It took several months to refine the design of the implosion mechanism. The final design used both fast and slow explosives arranged as a truncated icosahedron (the familiar soccer-ball pattern), with 20 hexagonal and 12 pentagonal lenses. Each lens weighed about 80 lbs (36 kg).
The implosion also needed to be perfectly timed in order to trigger the plutonium fission reaction, and an entirely new kind of detonator had to be invented for the purpose.
Designing the casing for the plutonium reactant was also a long, laborious process. An aluminum pusher and a natural uranium tamper were designed to compress and hold the triggered plutonium core together as long as possible while reflecting neutrons back into the plutonium to trigger more fission reactions.
Metallurgists also had to figure out how to properly cast the plutonium into a tight, even sphere. Plutonium is brittle at room temperature, but after some trial and error, it was found to be malleable between 572 and 842 degrees Fahrenheit (300 to 450 degrees Celsius).
In order to stabilize the plutonium, it was alloyed with gallium, heated, and pressed into a spherical shape. It was then coated in nickel to help prevent corrosion.
The first plutonium sphere was produced and delivered to the lab on July 2, 1945, with another delivered about three weeks later.
The Trinity Test: July 16, 1945
By the middle of July 1945, the time had come to finally test the results of years of intense research and effort.
On July 16, a device nicknamed "gadget" was placed on top of a 100-foot-tall (30.48 m) steel tower in the bombing range of the Alamogordo Army Airfield. It was placed on a tower to simulate it being dropped from a bomber, since an air detonation maximized the direct damage done to the target and reduced the amount of fallout generated by the blast.
The test, codenamed Trinity, was approved by the Army on the condition that a massive steel containment vessel, named Jumbo, be on hand to recover the active material in the event of a failure. They needn't have bothered: confidence in the design grew so high that the bomb was never placed inside it.
At 5:29:45 AM local time on July 16, 1945, "gadget" exploded with the force of about 20 kilotons of TNT, producing a mushroom cloud more than seven miles (11 km) tall and a shockwave felt over 100 miles (160 km) away. It left behind a 250-foot (76 m) wide crater lined with radioactive glass, now known as trinitite, and the sound of the detonation was heard as far away as Texas.
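For a sense of scale, the yield can be run back through E = mc², using the standard convention of 4.184 × 10¹² joules per kiloton of TNT (a reference value, not from the text):

```python
KT_TNT_JOULES = 4.184e12   # energy of one kiloton of TNT, by convention
C_M_PER_S = 2.998e8        # speed of light, m/s

yield_joules = 20 * KT_TNT_JOULES                 # ~8.4e13 J
mass_converted_kg = yield_joules / C_M_PER_S**2   # E = mc^2 rearranged: m = E / c^2
print(f"{yield_joules:.2e} J from about {mass_converted_kg * 1000:.2f} g of matter")
```

The entire Trinity explosion corresponds to less than one gram of matter converted into energy, which is the whole point of Meitner and Frisch's insight.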
Famously, Oppenheimer — who was on-site to observe the test — recalled thinking of the lines from the Hindu scriptures, the Bhagavad Gita (XI,12): "If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one..."
He described the moment more fully in a later interview: "We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, 'Now I am become Death, the destroyer of worlds.' I suppose we all thought that, one way or another."
Legacy of the Manhattan Project
Less than a month later, on August 6, 1945, Little Boy became the first nuclear weapon ever used in combat. The gun-type bomb, packed with all the uranium-235 the Manhattan Project could produce up until that point, was dropped on the Japanese city of Hiroshima.
Between the initial blast and the resulting burn and radiation injuries suffered by those not immediately killed by the explosion, an estimated 90,000 to 146,000 people died, many of whom were civilians.
Three days later, Fat Man, an implosion-type bomb packed with the Manhattan Project's remaining plutonium, exploded over Nagasaki, killing between 39,000 and 80,000; again, mostly civilians.
Japan surrendered less than a week later, and World War II came to an abrupt, horrifying end.
Among those who worked on the Manhattan Project, several went on to become prominent anti-proliferation activists, like Joseph Rotblat. Others, like von Neumann, continued to advance the development of nuclear weapons as a bulwark against the even greater horrors of another world war (as they saw it).
Regardless of where one stands, there is no denying the significance of the Manhattan Project in human history, but its legacy is still unknowable. We still live under the shadow of the mushroom cloud, and its legacy can only be written by those who manage to one day escape its reach.