Nuclear energy has had a turbulent history, largely because its core technology is inherently dangerous. Although it is still a relatively young energy source in the grand scheme of things, its origins date back to the late 1800s.
Let's explore the history of nuclear energy in a little more depth to follow its progress.
The beginnings of nuclear energy
The story of nuclear energy really begins in 1895, when Wilhelm Roentgen discovered x-rays.
While experimenting with a cathode ray tube, Roentgen noticed that photographic plates sitting nearby became exposed while the device was on, even when the tube was covered in black paper. This led him to conclude that the tube was emitting an invisible ray of a kind that had never been observed before.
What Roentgen noticed was actually x-rays propagating from the tube.
The following year, in France, Henri Becquerel discovered that uranium salts could produce penetrating radiation on their own, without excitation by any external energy source.
This observation led Becquerel to believe that the uranium itself must be producing x-rays.
Marie and Pierre Curie studied the phenomenon as well, isolating two new elements, polonium and radium, in the process. In 1898, their investigations led them to coin a new word: radioactivity.
While studying radioactivity in England, Ernest Rutherford discovered two new types of radiation, distinct from x-rays, which he called alpha and beta radiation.
Rutherford also made one of the most pivotal discoveries for the future of nuclear energy: his gold-foil experiments, begun in 1909, showed that the vast majority of an atom's mass is concentrated in its nucleus.
Rutherford is today considered the father of nuclear physics. He went on to name gamma radiation (first observed by Paul Villard in 1900), and in 1920 he theorized the existence of the neutron, despite having no direct evidence for it. The neutron was finally discovered by James Chadwick in 1932.
These foundational discoveries formed the basis for what would grow into the industry of nuclear energy production.
The splitting of atoms
In 1938, German scientists Otto Hahn and Fritz Strassmann bombarded uranium atoms with neutrons and found that a significant amount of energy was released. With the help of Lise Meitner and Otto Frisch, they explained what they had observed: the splitting of the atom, a process that came to be called fission.
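The energy release Hahn and Strassmann observed can be understood through Einstein's mass-energy relation: the fission fragments weigh slightly less than the original nucleus, and the missing mass appears as energy. As a rough back-of-the-envelope sketch (the figures below are standard textbook values, not from the account above):

```latex
% Each fission of a uranium-235 nucleus releases roughly 200 MeV.
% The corresponding mass defect follows from E = mc^2:
E = \Delta m \, c^2
\quad\Rightarrow\quad
\Delta m = \frac{200~\text{MeV}}{931.5~\text{MeV}/u} \approx 0.21\,u
% That is, about 0.1% of the nucleus's mass (0.21 of 235 u) is
% converted to energy -- millions of times more per atom than
% any chemical reaction releases.
```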
By 1939, physicists Leo Szilard and Enrico Fermi had theorized that fission could sustain a massive chain reaction, and that such a reaction could be used to create an explosion.
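The chain reaction Szilard and Fermi envisioned rests on a simple piece of arithmetic: each fission releases two to three neutrons, and if on average more than one of them goes on to trigger another fission, the number of fissions grows exponentially. In the standard notation (the multiplication factor k is a textbook convention, not a term from the account above):

```latex
% k = average number of new fissions caused by the neutrons
%     from one fission (the "multiplication factor")
N_n = N_0 \, k^{\,n} \quad \text{after } n \text{ generations}
% k < 1: the reaction dies out            (subcritical)
% k = 1: steady state -- a power reactor  (critical)
% k > 1: exponential growth -- a bomb     (supercritical)
% With k = 2, just 80 generations yield 2^{80} \approx 10^{24} fissions.
```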
In 1939, Szilard drafted a letter to President Roosevelt, signed by Albert Einstein, warning him about the possibility of nuclear weapons. The President responded by authorizing an advisory committee that grew into the US atomic bomb effort.
By 1942, Fermi, working as part of that effort, had created the first man-made fission chain reaction, in a reactor built at the University of Chicago. It was at this point that the Manhattan Project swung into full development.
The team pursued two bomb designs, one with a uranium core and one with a plutonium core. The project was highly secretive, and entire covert cities were built to support it. One facility, in Oak Ridge, Tennessee, produced enriched uranium; another, at Hanford in Washington State, used nuclear reactors to produce plutonium.
The now-famous secret site in Los Alamos, New Mexico, was used by hundreds of scientists for the research and construction of nuclear weapons.
The end of WWII, in 1945, saw the first use of nuclear weapons on people. This was also the moment when most of the world's population realized just how destructive the technology could be.
Reactors being used as power sources
It was 1951 before the first nuclear reactor to produce electricity was completed. Called Experimental Breeder Reactor I, it was based in Idaho and was cooled using liquid metal.
In 1954, the first nuclear-powered submarine, the USS Nautilus, was completed; nuclear propulsion allowed it to stay submerged for extended periods without refueling.
That same year, the Soviets completed the Obninsk Nuclear Power Plant, the first nuclear reactor connected to an electricity grid. The Shippingport Atomic Power Station, in Pennsylvania, came online in 1957 and was the world's first full-scale atomic electric power plant devoted exclusively to peacetime use.
The 1960s and 70s brought the development and construction of many more commercial nuclear reactors for electricity generation, many of them based on slightly modified designs from earlier reactors.
These nuclear power plants were touted as relatively cheap and emission-free sources of electricity. Nuclear power was seen by many at this time as holding the promise of being the power source of the future.
In 1974, France made a major push for the development of nuclear energy, eventually generating as much as 75% of its electricity from nuclear reactors. Over the same period, around 20% of electricity generation in the United States came from nuclear energy, produced by 104 reactors across the country.
However, in 1979, the future of nuclear power was thrown into question with the accident at Three Mile Island. This partial meltdown of a reactor in Pennsylvania began the shift in public opinion on the safety of nuclear reactors.
When the Chernobyl disaster occurred in 1986, releasing a vast cloud of radiation that affected much of northern Europe, with traces detected as far away as the east coast of the United States, global opinion shifted further away from nuclear power. These disasters did, however, lead to the creation of safer reactor designs.
One interesting piece of nuclear energy history: in the 1990s, Russia and the US agreed to down-blend material from dismantled nuclear warheads into reactor fuel. Around 10% of US nuclear electricity today is produced using fuel from dismantled weapons.
The nuclear energy sector in the post-Chernobyl era of the late 90s and 2000s was marked by a strong safety record in plant operations, with no radiation-related deaths at US plants. Public opinion began to shift back toward nuclear power as the industry demonstrated continued safety.
However, the Fukushima disaster of 2011, in which an earthquake and tsunami led to reactor meltdowns and the release of a large amount of radiation at a Japanese plant, served as a reminder that nuclear power is not completely safe.
Around 14 percent of global electricity is still produced by nuclear power plants today, and some researchers estimate that nuclear energy may have saved 1.8 million lives over its history by offsetting air pollution from fossil fuels.