Human extinction


In futures studies, human extinction is the hypothetical complete end of the human species. It may result from either natural or anthropogenic causes, but the risks of extinction through natural disaster, such as a meteorite impact or large-scale volcanism, are generally considered to be comparatively low.
Many possible scenarios of anthropogenic extinction have been proposed, such as climate change, global nuclear annihilation, biological warfare and ecological collapse. Some scenarios center on emerging technologies, such as advanced artificial intelligence, biotechnology, or self-replicating nanobots. The probability of anthropogenic human extinction within the next hundred years is the topic of an active debate.

Moral arguments regarding existential risk

"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress. Multiple scholars have argued based on the size of the "cosmic endowment" that because of the inconceivably large number of potential future lives that are at stake, even small reductions of existential risk have great value. Some of the arguments run as follows:
Parfit argues that the size of the "cosmic endowment" can be calculated from the following argument: If Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans, then there is a potential for 10^16 human lives of normal duration. Bostrom goes further, stating that if the universe is empty, then the accessible universe can support at least 10^34 biological human life-years; and, if some humans were uploaded onto computers, could even support the equivalent of 10^54 cybernetic human life-years.
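A minimal sketch of the arithmetic behind Parfit's 10^16 figure, assuming for illustration an average lifespan on the order of 100 years (the lifespan value is an assumption used here, not a figure taken from Parfit):

% Back-of-the-envelope version of Parfit's estimate; the 10^2-year average lifespan is an illustrative assumption.
\[
\frac{10^{9}\ \text{years (remaining habitable lifetime)} \times 10^{9}\ \text{people (sustainable population)}}{10^{2}\ \text{years per life (assumed average lifespan)}} = 10^{16}\ \text{potential future human lives}
\]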

Proposed scenarios

Severe forms of known or recorded disasters

Some scenarios involve extinction as a result of the effects or use of entirely new technologies.
Some scenarios envision that humans could use genetic engineering or technological modifications to split into normal humans and a new species – posthumans. Such a species could be fundamentally different from any previous life form on Earth, e.g. by merging humans with technological systems. Such scenarios assess the risk that the "old" human species will be outcompeted and driven to extinction by the new, posthuman entity.

Perception of and reactions to human extinction risk

Probability estimates

Nick Bostrom argues that it would be "misguided" to assume that the probability of near-term extinction is less than 25% and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure. A little more optimistically, philosopher John Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. Leslie's argument is somewhat frequentist, based on the observation that human extinction has never been observed, but it requires subjective anthropic arguments. Leslie also discusses the anthropic survivorship bias and states that the a priori certainty of observing an "undisastrous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe."
Some scholars believe that certain scenarios, such as global thermonuclear war, would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even McMurdo Station in Antarctica, which has contingency plans and supplies for a long isolation. In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war. Any number of events could lead to a massive loss of human life, but if the last few, most resilient humans are unlikely to die off as well, then that particular extinction scenario may not seem credible.

Psychology

Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:
Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of an existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".

All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the lack of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.
Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."

Research and initiatives

Psychologist Steven Pinker calls existential risk a "useless category" that can distract from real threats such as climate change and nuclear war. In contrast, other researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on Star Trek, snowboarding, or dung beetles than on existential risks. Bostrom's comparisons have been criticized as "high-handed". As of 2020, the Biological Weapons Convention organization has an annual budget of US$1.4 million.
Although existential risks are less manageable by individuals than, for example, health risks, Ken Olum, Joshua Knobe, and Alexander Vilenkin argue that the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."
Multiple organizations with the goal of helping prevent human extinction exist. Examples are the Future of Humanity Institute, the Centre for the Study of Existential Risk, the Future of Life Institute, the Machine Intelligence Research Institute, and the Global Catastrophic Risk Institute.

Omnicide

Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear or biological warfare, but it can also apply to extinction through means such as a global anthropogenic ecological catastrophe. Some philosophers, among them the antinatalist David Benatar, animal rights activist Steven Best, and anarchist Todd May, posit that human extinction would be a positive thing for the other organisms on the planet, and for the planet itself, citing, for example, the omnicidal nature of human civilization.

Proposed countermeasures

Stephen Hawking advocated colonizing other planets within the solar system once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war.
More economically, some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving global disaster. Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.

In popular culture

Some 21st-century pop-science works, including The World Without Us by Alan Weisman, pose an artistic thought experiment: what would happen to the rest of the planet if humans suddenly disappeared? A threat of human extinction, such as through a technological singularity, drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide. Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.