Monday 1 August 2011

Nuclear: How did it all start?


The pursuit of nuclear energy for electricity generation began soon after the discovery in the early 20th century that radioactive elements, such as radium, released immense amounts of energy, in accordance with the principle of mass–energy equivalence. At the time, however, harnessing that energy was impractical, because intensely radioactive elements were, by their very nature, short-lived (high energy release correlates with a short half-life). Still, the dream of harnessing "atomic energy" remained strong, even though it was dismissed by such fathers of nuclear physics as Ernest Rutherford as "moonshine." The situation changed in the late 1930s with the discovery of nuclear fission.
In 1932, James Chadwick discovered the neutron, which was immediately recognized as a potential tool for nuclear experimentation because of its lack of an electric charge. Experimentation with bombardment of materials with neutrons led Frédéric and Irène Joliot-Curie to discover induced radioactivity in 1934, which allowed the creation of radium-like elements at a fraction of the cost of natural radium. Further work by Enrico Fermi in the 1930s focused on using slow neutrons to increase the effectiveness of induced radioactivity. Experiments bombarding uranium with neutrons led Fermi to believe he had created a new, transuranic element, which he dubbed hesperium.
But in 1938, German chemists Otto Hahn and Fritz Strassmann, along with Austrian physicist Lise Meitner and Meitner's nephew, Otto Robert Frisch, conducted experiments with the products of neutron-bombarded uranium, as a means of further investigating Fermi's claims. They determined that the relatively tiny neutron split the nucleus of the massive uranium atoms into two roughly equal pieces, contradicting Fermi. This was an extremely surprising result: all other forms of nuclear decay involved only small changes to the mass of the nucleus, whereas this process—dubbed "fission" as a reference to biology—involved a complete rupture of the nucleus. Leó Szilárd was among the first of numerous scientists to recognize that if fission reactions released additional neutrons, a self-sustaining nuclear chain reaction could result. Once this was experimentally confirmed and announced by Frédéric Joliot-Curie in 1939, scientists in many countries (including the United States, the United Kingdom, France, Germany, and the Soviet Union) petitioned their governments for support of nuclear fission research, just on the cusp of World War II.
In the United States, where Fermi and Szilárd had both emigrated, this led to the creation of the first man-made reactor, known as Chicago Pile-1, which achieved criticality on December 2, 1942. This work became part of the Manhattan Project, which made enriched uranium and built large reactors to breed plutonium for use in the first nuclear weapons, which were used on the cities of Hiroshima and Nagasaki.
After World War II, the prospects of using "atomic energy" for good, rather than simply for war, were greatly advocated as a reason not to keep all nuclear research controlled by military organizations. However, most scientists agreed that civilian nuclear power would take at least a decade to master, and the fact that nuclear reactors also produced weapons-usable plutonium created a situation in which most national governments (such as those in the United States, the United Kingdom, Canada, and the USSR) attempted to keep reactor research under strict government control and classification. In the United States, reactor research was conducted by the U.S. Atomic Energy Commission, primarily at Oak Ridge, Tennessee, Hanford Site, and Argonne National Laboratory.
Work in the United States, United Kingdom, Canada, and USSR proceeded over the course of the late 1940s and early 1950s. Electricity was generated for the first time by a nuclear reactor on December 20, 1951, at the EBR-I experimental station near Arco, Idaho, which initially produced about 100 kW. Nuclear marine propulsion was also vigorously researched in the US, with a test reactor developed by 1953 (the USS Nautilus, the first nuclear-powered submarine, would put to sea in 1955). In 1953, US President Dwight Eisenhower gave his "Atoms for Peace" speech at the United Nations, emphasizing the need to develop "peaceful" uses of nuclear power quickly. This was followed by the 1954 Amendments to the Atomic Energy Act, which allowed rapid declassification of U.S. reactor technology and encouraged development by the private sector.
