Atomic Age


The Atomic Age, also known as the Atomic Era, is the period of history following the detonation of the first nuclear weapon, The Gadget, at the Trinity test in New Mexico on July 16, 1945, during World War II. Although nuclear chain reactions had been hypothesized in 1933 and the first artificial self-sustaining nuclear chain reaction had taken place in December 1942, the Trinity test and the ensuing bombings of Hiroshima and Nagasaki that ended World War II represented the first large-scale use of nuclear technology and ushered in profound changes in sociopolitical thinking and the course of technology development.
While atomic power was promoted for a time as the epitome of progress and modernity, entering the nuclear era also carried frightful implications: nuclear warfare, the Cold War, mutual assured destruction, nuclear proliferation, and the risk of nuclear disaster, alongside beneficial civilian applications such as nuclear medicine. It is not easy to fully separate peaceful uses of nuclear technology from military or terrorist uses, which complicated the development of a global nuclear-power export industry from the outset.
In 1973, with the nuclear power industry flourishing, the United States Atomic Energy Commission predicted that, by the turn of the 21st century, one thousand reactors would be producing electricity for homes and businesses across the U.S. However, the "nuclear dream" fell far short of what was promised, because nuclear technology produced a range of social problems, from the nuclear arms race to nuclear meltdowns, and the unresolved difficulties of bomb plant cleanup and of civilian plant waste disposal and decommissioning. After 1973, reactor orders declined sharply as electricity demand fell and construction costs rose, and many orders and partially completed plants were cancelled.
By the late 1970s, nuclear power had suffered a remarkable international setback. Faced with economic difficulties and widespread public opposition, the industry's troubles came to a head with the Three Mile Island accident in 1979 and the Chernobyl disaster in 1986, both of which adversely affected the nuclear power industry for many decades.

Early years

In 1901, Frederick Soddy and Ernest Rutherford discovered that radioactivity was part of the process by which atoms changed from one kind to another, involving the release of energy. Soddy wrote in popular magazines that radioactivity was a potentially "inexhaustible" source of energy, and offered a vision of an atomic future where it would be possible to "transform a desert continent, thaw the frozen poles, and make the whole earth one smiling Garden of Eden." The promise of an "atomic age," with nuclear energy as the global, utopian technology for the satisfaction of human needs, has been a recurring theme ever since. But Soddy also saw that atomic energy could possibly be used to create terrible new weapons.
The concept of a nuclear chain reaction was hypothesized in 1933, shortly after Chadwick's discovery of the neutron. Only a few years later, in December 1938, nuclear fission was discovered by Otto Hahn and his assistant Fritz Strassmann, and explained theoretically by Lise Meitner and Otto Frisch. The first artificial self-sustaining nuclear chain reaction took place in December 1942, under the leadership of Enrico Fermi.
In 1945, the pocketbook The Atomic Age heralded the untapped atomic power in everyday objects and depicted a future where fossil fuels would go unused. One science writer, David Dietz, wrote that instead of filling a car's gas tank two or three times a week, a driver would travel for a year on a pellet of atomic energy the size of a vitamin pill. Glenn T. Seaborg, who chaired the Atomic Energy Commission, wrote, "there will be nuclear powered earth-to-moon shuttles, nuclear powered artificial hearts, plutonium heated swimming pools for SCUBA divers, and much more".
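Dietz's claim was at least plausible in terms of raw energy density. As a rough back-of-envelope check (the figures below are standard physical values and illustrative assumptions, not numbers given by Dietz), complete fission of uranium-235 releases about 200 MeV per nucleus, while gasoline yields about 46 MJ/kg (roughly 34 MJ per litre):

\[
E_{\text{U-235}} \approx \frac{6.02 \times 10^{23}}{0.235\ \text{kg}} \times 200\ \text{MeV} \approx 8.2 \times 10^{13}\ \text{J/kg}, \qquad E_{\text{gasoline}} \approx 4.6 \times 10^{7}\ \text{J/kg}.
\]

A one-gram pellet, fully fissioned, would therefore release about \(8.2 \times 10^{10}\) J, the energy content of roughly 2,400 litres of gasoline, on the order of a typical car's annual fuel consumption, although no practical car-sized engine could extract that energy.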

World War II

The phrase "Atomic Age" was coined by William L. Laurence, a New York Times journalist who became the official journalist for the Manhattan Project which developed the first nuclear weapons. He witnessed both the Trinity test and the bombing of Nagasaki and went on to write a series of articles extolling the virtues of the new weapon. His reporting before and after the bombings helped to spur public awareness of the potential of nuclear technology and in part motivated development of the technology in the U.S. and in the Soviet Union. The Soviet Union would go on to test its first nuclear weapon in 1949.
In 1949, U.S. Atomic Energy Commission chairman David Lilienthal stated that "atomic energy is not simply a search for new energy, but more significantly a beginning of human history in which faith in knowledge can vitalize man's whole life".

1950s

The phrase gained popularity as a feeling of nuclear optimism emerged in the 1950s, when it was believed that all power generators of the future would be atomic in nature. The atomic bomb would render all conventional explosives obsolete, and nuclear power plants would do the same for power sources such as coal and oil. There was a general feeling that everything would use a nuclear power source of some sort in a positive and productive way, from irradiating food to preserve it to the development of nuclear medicine. There would be an age of peace and plenty in which atomic energy would "provide the power needed to desalinate water for the thirsty, irrigate the deserts for the hungry, and fuel interstellar travel deep into outer space". Such uses would make the Atomic Age as significant a step in technological progress as the first smelting of bronze or iron, or the commencement of the Industrial Revolution.
This optimism extended even to cars, leading Ford to display the Ford Nucleon concept car to the public in 1958. There were also promises of golf balls that could always be found and of nuclear-powered aircraft, on which the US federal government spent US$1.5 billion in research. Nuclear policymaking became almost a collective technocratic fantasy, or at least was driven by fantasy:

The very idea of splitting the atom had an almost magical grip on the imaginations of inventors and policymakers. As soon as someone said – in an even mildly credible way – that these things could be done, then people quickly convinced themselves... that they would be done.

In the US, military planners "believed that demonstrating the civilian applications of the atom would also affirm the American system of private enterprise, showcase the expertise of scientists, increase personal living standards, and defend the democratic lifestyle against communism".
Some media reports predicted that, thanks to the giant nuclear power stations of the near future, electricity would soon become much cheaper and that electricity meters would be removed, because power would be "too cheap to meter."
When the Shippingport reactor went online in 1957, it produced electricity at a cost roughly ten times that of coal-fired generation. Scientists at the AEC's own Brookhaven Laboratory "wrote a 1958 report describing accident scenarios in which 3,000 people would die immediately, with another 40,000 injured".
However, Shippingport was an experimental reactor using highly enriched uranium and originally intended for a nuclear-powered aircraft carrier. Kenneth Nichols, a consultant for the Connecticut Yankee and Yankee Rowe nuclear power stations, wrote that while they were considered "experimental" and not expected to be competitive with coal and oil, they "became competitive because of inflation... and the large increase in price of coal and oil." He wrote that for nuclear power stations, capital cost is the major cost factor over the life of the plant, and hence "antinukes" try to increase costs and building time with changing regulations and lengthy hearings, so that "it takes almost twice as long to build a[n] atomic power plant in the United States as in France, Japan, Taiwan or South Korea." France's pressurised-water nuclear plants produce 60% of the country's electric power and have proven to be much cheaper than oil or coal.
Fear of possible atomic attack from the Soviet Union caused U.S. school children to participate in "duck and cover" civil defense drills.

Atomic City

During the 1950s, Las Vegas, Nevada, earned the nickname "Atomic City" after becoming a hotspot where tourists gathered to watch above-ground nuclear weapons tests at the Nevada Test Site. Following the detonation of Able, one of the first atomic bombs dropped at the Nevada Test Site, the Las Vegas Chamber of Commerce began advertising the tests to tourists as an entertainment spectacle.
The detonations proved popular, and casinos throughout the city capitalised on the tests by advertising hotel rooms or rooftops with views of the testing site, or by planning "Dawn Bomb Parties" where people came together to celebrate the detonations. Most parties started at midnight, and musicians would perform until 4:00 a.m., when the party would briefly stop so guests could silently watch the detonation. Some casinos capitalised on the tests further by creating so-called "atomic cocktails", a mixture of vodka, cognac, sherry and champagne.
Meanwhile, groups of tourists would drive out into the desert with family or friends to watch the detonations.
Despite the health risks associated with nuclear fallout, tourists and viewers were told simply to "shower". Later, however, many who had worked at the testing site or lived in areas exposed to nuclear fallout fell ill, with higher chances of developing cancer or of premature death.

1960s

By exploiting the peaceful uses of the "friendly atom" in medical applications, earth removal and, subsequently, in nuclear power plants, the nuclear industry and government sought to allay public fears about nuclear technology and promote the acceptance of nuclear weapons. At the peak of the Atomic Age, the United States government initiated Operation Plowshare, involving "peaceful nuclear explosions". The United States Atomic Energy Commission chairman announced that the Plowshares project was intended to "highlight the peaceful applications of nuclear explosive devices and thereby create a climate of world opinion that is more favorable to weapons development and tests".
Project Plowshare "was named directly from the Bible itself, specifically Micah 4:3, which states that God will beat swords into ploughshares, and spears into pruning hooks, so that no country could lift up weapons against another". Proposed uses included widening the Panama Canal, constructing a new sea-level waterway through Nicaragua nicknamed the Pan-Atomic Canal, cutting paths through mountainous areas for highways, and connecting inland river systems. Other proposals involved blasting caverns for water, natural gas, and petroleum storage, and detonating atomic bombs underground to extract shale oil in eastern Utah and western Colorado. Serious consideration was also given to using these explosives for various mining operations. One proposal suggested using nuclear blasts to connect underground aquifers in Arizona; another involved surface blasting on the western slope of California's Sacramento Valley for a water transport project.

However, Project Plowshare's 27 nuclear explosions had many negative impacts: blighted land, relocated communities, tritium-contaminated water, radioactivity, and fallout from debris hurled high into the atmosphere. These consequences were ignored and downplayed until the program was terminated in 1977, due in large part to public opposition, after $770 million had been spent on the project.
The Thunderbirds TV series presented a fleet of vehicles imagined to be completely nuclear-powered, as shown in cutaway drawings in the associated comic books.
The term "atomic age" was initially used in a positive, futuristic sense, but by the 1960s the threats posed by nuclear weapons had begun to edge out nuclear power as the dominant motif of the atom.

1970 to 2000

French advocates of nuclear power developed an aesthetic vision of nuclear technology as art to bolster support for the technology. Leclerq compares the nuclear cooling tower to some of the grandest architectural monuments of western culture:

The age in which we live has, for the public, been marked by the nuclear engineer and the gigantic edifices he has created. For builders and visitors alike, nuclear power plants will be considered the cathedrals of the 20th century. Their syncretism mingles the conscious and the unconscious, religious fulfilment and industrial achievement, the limitations of uses of materials and boundless artistic inspiration, utopia come true and the continued search for harmony.

In 1973, the United States Atomic Energy Commission predicted that, by the turn of the 21st century, one thousand reactors would be producing electricity for homes and businesses across the U.S. But after 1973, reactor orders declined sharply as electricity demand fell and construction costs rose. Many orders and partially completed plants were cancelled.
Nuclear power has proved controversial since the 1970s. If cooling fails, highly radioactive materials may overheat and escape from the reactor building. Nuclear waste must be regularly removed from reactors and disposed of safely for up to a million years, so that it does not pollute the environment. Recycling of nuclear waste has been discussed, but it creates plutonium, which can be used in weapons, and in any case still leaves much unwanted waste to be stored and disposed of. Large, purpose-built facilities for long-term disposal of nuclear waste have been difficult to site and have not yet reached fruition.
By the late 1970s, nuclear power had suffered a remarkable international setback. Faced with economic difficulties and widespread public opposition, the industry's troubles came to a head with the Three Mile Island accident in 1979 and the Chernobyl disaster in 1986, both of which adversely affected the nuclear power industry for decades thereafter. A cover story in the February 11, 1985, issue of Forbes magazine commented on the overall management of the nuclear power program in the United States:

The failure of the U.S. nuclear power program ranks as the largest managerial disaster in business history, a disaster on a monumental scale … only the blind, or the biased, can now think that the money has been well spent. It is a defeat for the U.S. consumer and for the competitiveness of U.S. industry, for the utilities that undertook the program and for the private enterprise system that made it possible.

Thus, in a period of just over 30 years, the early dramatic rise of nuclear power went into an equally meteoric reverse. With no other energy technology has there been such a conjunction of rapid and revolutionary international emergence followed so quickly by an equally transformative demise.

21st century

In the 21st century, the label "Atomic Age" connotes either a sense of nostalgia or naïveté, and the era is considered by many to have ended with the fall of the Soviet Union in 1991, though the term continues to be used by many historians to describe the era following the conclusion of the Second World War. Atomic energy and weapons continue to have a strong effect on world politics in the 21st century. Some science fiction fans use the term to describe not only the post-war era but also contemporary history up to the present day.
The nuclear power industry has improved the safety and performance of reactors and has proposed new, safer reactor designs, but there is no guarantee that the reactors will be designed, built and operated correctly. Mistakes do occur: the designers of the reactors at Fukushima in Japan did not anticipate that a tsunami generated by an earthquake would disable the backup systems that were supposed to stabilize the reactors after the earthquake. According to UBS AG, the Fukushima I nuclear accidents have cast doubt on whether even an advanced economy like Japan can master nuclear safety. Catastrophic scenarios involving terrorist attacks are also conceivable. An interdisciplinary team from MIT estimated that if nuclear power use tripled between 2005 and 2055, at least four serious nuclear accidents would be expected in that period.
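The MIT estimate is consistent with a simple expected-value calculation, sketched here under assumed figures that are not taken from the MIT report: if the world fleet grew from roughly 400 to 1,200 reactors between 2005 and 2055 (an average of about 800 operating reactors), and each carried a core-damage frequency on the order of \(10^{-4}\) per reactor-year, a figure commonly cited for reactor designs of that era, then

\[
N \approx \lambda \times \text{reactor-years} \approx 10^{-4}\ \text{yr}^{-1} \times 800 \times 50\ \text{yr} \approx 4
\]

serious accidents would be expected over the 50-year period.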
In September 2012, Japan announced that it would completely phase out nuclear power by 2030, joining Germany and other countries in reacting to the accident at Fukushima, although under the Abe administration this is now unlikely. Germany plans to completely phase out nuclear energy by 2022.

Chronology

A large anti-nuclear demonstration was held on May 6, 1979, in Washington, D.C., when 125,000 people, including the Governor of California, attended a march and rally against nuclear power. In New York City on September 23, 1979, almost 200,000 people attended a protest against nuclear power. Anti-nuclear power protests preceded the shutdown of the Shoreham, Yankee Rowe, Millstone I, Rancho Seco, Maine Yankee, and about a dozen other nuclear power plants.
On June 12, 1982, one million people demonstrated in New York City's Central Park against nuclear weapons and for an end to the Cold War arms race. It was the largest anti-nuclear protest and the largest political demonstration in American history. International Day of Nuclear Disarmament protests were held on June 20, 1983, at 50 sites across the United States.
In 1986, hundreds of people walked from Los Angeles to Washington, D.C. in the Great Peace March for Global Nuclear Disarmament. There were many Nevada Desert Experience protests and peace camps at the Nevada Test Site during the 1980s and 1990s.
On May 1, 2005, 40,000 anti-nuclear/anti-war protesters marched past the United Nations in New York, 60 years after the atomic bombings of Hiroshima and Nagasaki. This was the largest anti-nuclear rally in the U.S. for several decades.

Discovery and development