Human extinction
In futures studies, human extinction is the hypothetical complete end of the human species. It may result from either natural or anthropogenic causes, but the risks of extinction through natural disaster, such as a meteorite impact or large-scale volcanism, are generally considered comparatively low.
Many possible scenarios of anthropogenic extinction have been proposed, such as climate change, global nuclear annihilation, biological warfare and ecological collapse. Some scenarios center on emerging technologies, such as advanced artificial intelligence, biotechnology, or self-replicating nanobots. The probability of anthropogenic human extinction within the next hundred years is the subject of active debate.
Moral arguments regarding existential risk
"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress. Multiple scholars have argued, based on the size of the "cosmic endowment", that the inconceivably large number of potential future lives at stake gives even small reductions of existential risk great value. Some of the arguments run as follows:
- Carl Sagan wrote in 1983: "If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in future generations who would not be born.... The stakes are one million times greater for extinction than for the more modest nuclear wars that kill 'only' hundreds of millions of people. There are many other possible measures of the potential loss—including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise."
- Philosopher Derek Parfit in 1984 makes an anthropocentric utilitarian argument that, because all human lives have roughly equal intrinsic value no matter where in time or space they are born, the large number of lives potentially saved in the future should be multiplied by the percentage chance that an action will save them, yielding a large net benefit for even tiny reductions in existential risk.
- According to J. Richard Gott's formulation of the controversial doomsday argument, which holds that we have probably already lived through half the total duration of human history, humanity has a 95% probability of going extinct within 7,800,000 years.
- Philosopher Robert Adams in 1989 rejects Parfit's "impersonal" views, but speaks instead of a moral imperative for loyalty and commitment to "the future of humanity as a vast project... The aspiration for a better society – more just, more rewarding, and more peaceful... our interest in the lives of our children and grandchildren, and the hopes that they will be able, in turn, to have the lives of their children and grandchildren as projects."
- Philosopher Nick Bostrom argues in 2013 that preference-satisfactionist, democratic, custodial, and intuitionist arguments all converge on the common-sense view that preventing existential risk is a high moral priority, even if the exact "degree of badness" of human extinction varies between these philosophies.
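Gott's bound above follows from simple arithmetic: if the present moment is a random point in the species' lifetime, then with 95% confidence its future duration lies between 1/39 and 39 times its past duration. A minimal sketch of that calculation (the roughly 200,000-year figure for our species' past is an illustrative assumption, not a claim from the argument itself):

```python
def gott_interval(past_duration, confidence=0.95):
    """95% confidence bounds on remaining duration under Gott's
    delta-t argument: observed at a random point in its lifetime,
    a phenomenon's future lasts between (1 - c)/(1 + c) and
    (1 + c)/(1 - c) times its past."""
    ratio = (1 + confidence) / (1 - confidence)  # 39 for c = 0.95
    return past_duration / ratio, past_duration * ratio

# Illustrative assumption: ~200,000 years of Homo sapiens so far.
lower, upper = gott_interval(200_000)
print(round(lower), round(upper))  # → 5128 7800000
```

Multiplying the assumed 200,000-year past by 39 reproduces the 7.8-million-year upper bound cited in the bullet above.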
Proposed scenarios
Severe forms of known or recorded disasters
- A widespread belief holds that climate change could result in human extinction. Carl Sagan and others have raised the prospect of extreme runaway global warming turning Earth into an uninhabitable Venus-like planet. Some scholars argue that much of the world would become uninhabitable under severe global warming, but even these scholars do not tend to argue that it would lead to complete human extinction, according to Kelsey Piper of Vox. All the IPCC scenarios, including the most pessimistic ones, predict temperatures compatible with human survival; the question of human extinction under "unlikely" outlier models is not generally addressed by the scientific literature. Factcheck.org judges that climate change fails to pose an established 'existential risk', stating: "Scientists agree climate change does pose a threat to humans and ecosystems, but they do not envision that climate change will obliterate all people from the planet." On a much longer time scale, natural shifts such as Milankovitch cycles could create unknown climate changes. In the even longer term, the Earth will naturally become uninhabitable as a result of the Sun's stellar evolution, within about a billion years.
- Nuclear or biological warfare; for example, a future arms race may result in larger arsenals than those of the Cold War. Some fear that a hypothetical World War III could cause the annihilation of humankind, perhaps through a resulting nuclear winter, as some experts have hypothesized.
- A pandemic involving one or more viruses, prions, or antibiotic-resistant bacteria. Past pandemics include the 1918 Spanish flu outbreak, estimated to have killed 3–5% of the global population, the 14th-century Eurasian Black Death pandemic, and the various European viruses that decimated indigenous American populations. A deadly pandemic restricted to humans alone would be self-limiting, as its mortality would reduce the density of its target population. A pathogen with a broad host range in multiple species, however, could eventually reach even isolated human populations, for example by using animals as "carriers". U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity", if left unchecked, is technically feasible and that the technical obstacles are "trivial". However, they are confident that in practice, countries would be able to "recognize and intervene effectively" to halt the spread of such a microbe and prevent human extinction.
- Population decline through a preference for fewer children. If developing world demographics are assumed to become developed world demographics, and if the latter are extrapolated, some projections suggest an extinction before the year 3000. John A. Leslie estimates that if the reproduction rate drops to the German or Japanese level the extinction date will be 2400. However, some models suggest the demographic transition may reverse itself due to evolutionary biology.
- A geological or cosmological disaster such as an impact event of a near-Earth object, a lethal gamma-ray burst in our part of the Milky Way, a supervolcanic eruption, or natural long-term climate change. Near-Earth objects pose a potential threat to the survival of living species, and a single extraterrestrial impact can lead to widespread species extinctions. However, none of the large "dinosaur-killer" asteroids known to Spaceguard pose a near-term threat of collision with Earth.
Habitat threats
- Human activity has triggered an extinction event often referred to as the sixth "mass extinction". The 2019 Global Assessment Report on Biodiversity and Ecosystem Services, published by the United Nations' Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, asserts that roughly one million species of plants and animals face extinction from human impacts such as expanding land use for industrial agriculture and livestock rearing, along with overfishing. A 1997 assessment states that over a third of Earth's land has been modified by humans, that atmospheric carbon dioxide has increased around 30 percent, that humans are the dominant source of nitrogen fixation, that humans control most of the Earth's accessible surface fresh water, and that species extinction rates may be over a hundred times faster than normal. The Global Footprint Network estimates that current activity uses resources twice as fast as they can be naturally replenished, and that growing human population and increased consumption pose the risk of resource depletion and a concomitant population crash. Evidence suggests birth rates may be rising in the 21st century in the developed world. Projections vary; researcher Hans Rosling has projected population growth to start to plateau around 11 billion, and then to slowly grow or possibly even shrink thereafter. A 2014 study published in Science asserts that the human population will grow to around 11 billion by 2100 and that growth will continue into the next century.
- In around 1 billion years from now, the Sun's brightness will increase as the hydrogen in its core is depleted, and the resulting heating of the Earth may cause the oceans to evaporate, leaving only minor forms of life. Well before this time, the level of carbon dioxide in the atmosphere will be too low to support plant life, destroying the foundation of the food chains. See Future of the Earth.
- About 7–8 billion years from now, after the Sun has become a red giant, the Earth will probably be engulfed by the expanding Sun and destroyed.
- According to standard physics, over far larger timescales the entire universe will gradually become uninhabitable, eventually resulting in unavoidable human extinction associated with the heat death of the universe.
Scientific accidents
- The creators of a superintelligent entity could inadvertently give it goals that lead it to annihilate the human race.
- Uncontrolled nanotechnology incidents resulting in the destruction of the Earth's ecosystem.
- Creation of a micro black hole on Earth during the course of a scientific experiment, or other unlikely scientific accidents in high-energy physics research, such as vacuum phase transition or strangelet incidents. Concerns were raised about the Large Hadron Collider at CERN, out of fear that colliding protons at near the speed of light would create a black hole, but it has been pointed out that much more energetic collisions occur naturally in Earth's atmosphere.
Further scenarios of extraterrestrial origin
- Invasion by militarily superior extraterrestrials – although often considered a scenario purely from the realm of science fiction, professional SETI researchers have given this possibility serious consideration, concluding that it is unlikely.
Evolution of a posthuman species
Perception of and reactions to human extinction risk
Probability estimates
Nick Bostrom argues that it would be "misguided" to assume that the probability of near-term extinction is less than 25% and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure. A little more optimistically, philosopher John Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. Leslie's argument is somewhat frequentist, based on the observation that human extinction has never been observed, but it requires subjective anthropic arguments. Leslie also discusses anthropic survivorship bias, noting that the a priori certainty of observing an "undisastrous past" makes it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe."
Some scholars believe that certain scenarios such as global thermonuclear war would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even McMurdo Station in Antarctica, which has contingency plans and supplies for a long isolation. In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war. Any number of events could lead to a massive loss of human life; but if the last few, most resilient humans are unlikely to also die off, then that particular human extinction scenario may not seem credible.
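Leslie's 70% figure can be restated as a per-century rate, under the simplifying and purely illustrative assumption of a constant, independent survival probability in each century:

```python
# Illustrative only: treat Leslie's 70% five-century survival
# estimate as a constant per-century survival probability
# compounded over five centuries.
five_century_survival = 0.70
per_century_survival = five_century_survival ** (1 / 5)
per_century_risk = 1 - per_century_survival

print(round(per_century_survival, 3))  # → 0.931
print(round(per_century_risk, 3))      # → 0.069
```

Under that assumption, a 70% chance of surviving five centuries corresponds to roughly a 7% extinction risk per century.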
Psychology
Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks: "Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of an existential risk, and say, 'Well, maybe the human species doesn't really deserve to survive'."
All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the lack of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.
Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."
Research and initiatives
Psychologist Steven Pinker calls existential risk a "useless category" that can distract from real threats such as climate change and nuclear war. In contrast, other researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on Star Trek, snowboarding, or dung beetles than on existential risks, a comparison that has been criticized as "high-handed". As of 2020, the Biological Weapons Convention organization has an annual budget of US$1.4 million.
Although existential risks are less manageable by individuals than, for example, health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the "universal" doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."
Multiple organizations with the goal of helping prevent human extinction exist. Examples are the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute, and the Global Catastrophic Risk Institute.
Omnicide
Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear or biological warfare, but it can also apply to extinction through means such as a global anthropogenic ecological catastrophe. Some philosophers, among them the antinatalist David Benatar, animal rights activist Steven Best, and anarchist Todd May, posit that human extinction would be a positive thing for the other organisms on the planet, and for the planet itself, citing for example the omnicidal nature of human civilization.
Proposed countermeasures
Stephen Hawking advocated colonizing other planets within the Solar System once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war.
More economically, some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created to survive a global disaster. Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.