Human extinction


Human extinction is the complete end of the human species. Various scenarios have been discussed in science, popular culture, and religion (see End time). This article focuses on existential risks to humanity.

Humans are very widespread on the Earth, and live in communities which (whilst interconnected) are capable of basic survival in isolation. Therefore, pandemics and deliberate killing aside, to achieve human extinction the entire planet would have to be rendered uninhabitable. This would most plausibly occur during a mass extinction event, for which a precedent exists in the Permian–Triassic extinction event, among other examples.

In the near future there are two anthropogenic scenarios, catastrophic climate change and all-out nuclear war (which could itself result from catastrophic climate change as nation states grow increasingly desperate), and two possible natural ones, bolide impact and large-scale volcanism. Both natural scenarios have occurred repeatedly in the geologic past, and there is no reason to consider them unlikely in the future. As technology develops, there is also a possibility that humans may be deliberately destroyed by the actions of a rogue state or individual in a form of global suicide attack. A more likely scenario is the emergence of a pandemic of such virulence and infectiousness that very few humans survive the disease. While not strictly a human extinction event, this may leave only very small, very scattered human populations that would then evolve in isolation.

It is important to differentiate between human extinction and the extinction of all life on Earth. Of the possible extinction events, only a pandemic is selective enough to eliminate humanity while leaving the rest of life on Earth relatively unscathed.

Possible scenarios

  • Severe forms of known or recorded disasters
  • Environmental collapses
  • Long-term habitat threats
    • There is an estimated 1% chance that, over the lifetime of the Solar System, Jupiter's gravitational pull will perturb Mercury's orbit enough for it to cross the orbital path of Venus.[citation needed] Were this to happen, a critically close encounter with Venus could send Mercury off its orbit entirely (see gravitational slingshot), after which it might collide with Earth (though it would more likely collide with Venus or the Sun, or simply leave the Solar System), wiping out all forms of life, including humans.
    • Within a million years, the hypergiant Eta Carinae, which is 7500 light years from the Sun, may go hypernova.
    • In 1.4 million years Gliese 710 will be only 1.1 light years from Earth and might catastrophically perturb the Oort cloud, possibly resulting in a comet shower.
    • In about 3 billion years, our Milky Way galaxy is expected to collide with the Andromeda galaxy. Collisions between individual bodies will likely be rare; however, the consequences for the orbits of stars and planets are unclear, and impossible to predict for individual stellar systems.
    • Some 5 billion years from now, the Sun's stellar evolution will reach the red giant stage, in which it will expand and engulf Earth. Well before this happens, the Sun will already have changed Earth's climate, and its radiated spectrum may alter in ways Earth-bound humans could not survive.[1]
    • In the far future, the main risks to human survival could be the heat death of the universe and the cooling that accompanies its continued expansion.
  • Evolution of humanity into a posthuman life-form or existence by means of technology, leaving no trace of original humans
  • Evolution of humanity into another hominid species. Humans will continue to evolve via traditional natural selection over a period of millions of years, and Homo sapiens may gradually transition into one or more new species. This mechanism for the extinction of Homo sapiens would, however, require that regional interbreeding cease for tens of thousands of years.
  • Dysgenics among humanity resulting in a less intelligent species. (See Idiocracy.)
  • Population decline
    • Preference for fewer children: if developed-world demographics are extrapolated, they mathematically lead to 'soft' extinction before 3000 AD (John Leslie estimates that if the reproduction rate dropped to the German level, the extinction date would be 2400[2]). A rough sketch of this kind of extrapolation appears after this list.
    • Political intervention in reproduction has failed to raise the birth rate above the replacement level in the rich world, but has dramatically succeeded in lowering it below the replacement level in China[citation needed] (see one-child policy). A world government with a eugenic or small-population policy could send humanity into 'voluntary' extinction.
    • Infertility: caused by hormonal disruption from the chemical and pharmaceutical industries, or by biological changes such as the (controversial) findings of falling sperm counts in human males. (See The Children of Men (novel) or Children of Men (film).)
    • A disruption, chemical, biological, or otherwise, of humans' ability to reproduce properly or at all.
    • Disease: the 'weak-gened' and those with birth defects are kept alive by medicine. This is the opposite of natural selection, under which the weak are less likely to survive and reproduce successfully, leaving the species genetically 'strong'. If, eventually, everyone carried weak or flawed genes, and these defects became increasingly severe, the human body could become unable to fight disease even with the help of advanced medicine, and disease would end the human species[citation needed]. Arguably, however, if this point were reached, natural selection would again become a factor, potentially reversing the 'decline'.
    • Voluntary extinction
  • Scientific accidents
    • In his book Our Final Hour, Sir Martin Rees claims that without the appropriate regulation, scientific advancement increases the risk of human extinction as a result of the effects or use of new technology. Some examples are provided below.
      • Uncontrolled nanotechnology (grey goo) incidents resulting in the destruction of the Earth's ecosystem (ecophagy).
      • Creation of a naked singularity (such as a "micro black hole") on Earth during the course of a scientific experiment, or other foreseeable accidents in high-energy physics research, such as vacuum phase transition or strangelet incidents. There were worries concerning the Large Hadron Collider at CERN, as it was feared that the collision of protons at nearly the speed of light would create a black hole; however, it has been pointed out that much more energetic collisions take place constantly in Earth's atmosphere.
      • The world's food supply being threatened and extinguished as a result of scientific tampering, whether through genetic engineering encouraging more prevalent plant diseases, or through pesticides contributing to the destruction of crops via their effect on bees (see colony collapse disorder).
    • Accidental contact with an alien civilization attracted by Earth's radio and television broadcasts, radar, and other signals.
    • Biotech disaster such as green goo. (e.g. the warnings of Jeremy Rifkin)
  • Scenarios of extraterrestrial origin
    • Major impact events.
    • If a rogue black hole passed near the Sun, it could disrupt Earth's orbit.
    • Gamma-ray burst in our part of the Milky Way. (Bursts observable in other galaxies are calculated to act as "sterilizers", and have been used by some astronomers to explain the Fermi paradox.) The lack of interruptions in the fossil record, and the relative distance of the nearest hypernova candidate, make this a long-term (rather than imminent) threat.
      • The Wolf-Rayet star WR 104, which is 8000 light years from the Sun, may produce a gamma-ray burst aimed at the Sun when it goes supernova.
    • Invasion by militarily superior aliens (see alien invasion). Though often considered a scenario purely from the realms of science fiction, professional SETI researchers have given serious consideration to this possibility, and conclude that it is unlikely.[3]
    • Gerard O'Neill has cautioned that first contact with an alien intelligence may follow the precedent set by historical contacts between human civilizations, in which the less technologically advanced civilization has inevitably succumbed to the more advanced one, regardless of the latter's intentions.
    • Solar flares may suddenly heat the Earth, or the Sun's light may be blocked by dust, slowly freezing the planet (e.g. dust and vapour from a Kuiper belt disturbance).
    • A vacuum phase transition could destroy the universe.
    • It is possible that our universe, the Big Bang, and all their consequences are events taking place within a computer or other device on another cosmological plane; if that process were to end, everything within the universe would summarily vanish (see simulated reality).
  • Philosophical scenarios
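
The 'soft extinction' extrapolation mentioned under population decline above is simple compound-decline arithmetic. The following is a minimal sketch, not John Leslie's actual calculation: the starting population, fertility rate, generation length, and the 1000-person threshold are all assumed figures chosen for illustration.

```python
# Illustrative compound-decline model of 'soft' extinction.
# NOT Leslie's method: constant fertility and generation length are
# simplifying assumptions; real projections use age-structured cohorts.
POP_START = 1.0e9      # assumed starting population
TFR = 1.4              # roughly the German-level fertility rate cited above
REPLACEMENT = 2.1      # children per woman needed for a steady population
GENERATION_YEARS = 30  # assumed generation length

pop, years = POP_START, 0
while pop >= 1000:               # treat <1000 people as 'soft' extinction
    pop *= TFR / REPLACEMENT     # each generation shrinks by this factor
    years += GENERATION_YEARS

print(f"Population falls below 1,000 after roughly {years} years")
# With these assumptions the loop ends after ~1050 years, the same
# order of magnitude as the dates quoted in the list above.
```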

Attitudes to human extinction

Attitudes to human extinction vary widely depending on beliefs concerning spiritual survival (souls, heaven, reincarnation, and so forth), the value of the human species, whether the human species evolves individually or collectively, and many other factors. Many religions prophesy an "end times" for the universe. Human extinction is therefore part of the faith of many humans, to the extent that the end times mean the absolute end of their physical humanity, though perhaps not of an inner soul.

However, not all faiths connect human extinction to the end times, since some believe in cyclical regeneration, or that the end times actually mark the beginning of a new kind of existence (see eschatology and utopianism).

Perception of human extinction risk

The general level of fear about human extinction in the near term is very low, despite the pronouncements of some fringe groups; few consider it a credible risk. Suggested reasons for the low public visibility of human extinction include:

  1. There have been countless prophecies of extinction throughout history; in all cases the predicted date of doom has passed without much notice, making future warnings less frightening. However, survivorship bias would undercut the credibility even of accurate extinction warnings: John von Neumann was probably wrong in having “a certainty”[4] that nuclear war would occur, but our survival is not proof that the chance of a fatal nuclear exchange was low (or indeed that such an event could not occur in the future).
  2. Extinction scenarios (see below) are speculative and hard to quantify. A frequentist approach to probability cannot be used to assess the danger of an event that has never been observed by humans (a toy illustration of this problem appears after this list).
  3. Nick Bostrom, head of the Future of Humanity Institute at the James Martin 21st Century School, has suggested that extinction risk analysis may be an overlooked field because the subject is too psychologically troubling to attract potential researchers, and because the lack of previous human extinction events leads to a depressed view of the likelihood of one happening under changing future circumstances (an 'inverse survivorship bias').
  4. There are thousands of public safety jobs dedicated to analyzing and reducing the risks of individual death. There are no full-time existential safety commissioners partly because there is no way to tell if they are doing a good job, and no way to punish them for failure. The inability to judge performance might also explain the comparative governmental apathy on preventing human extinction (as compared to panda extinction, say).
  5. Some anthropologists believe that risk perception is biased by social structure; in the "Cultural Theory of risk" typology, "individualist" societies predispose their members to the belief that nature operates as a self-correcting system, which will return to its stable state after a disturbance. People in such cultures feel comfortable with a "trial-and-error" approach to risk, even for dangers so rare (such as extinction events) that trial and error is unsuitable.
  6. It is possible to do something about dietary or motor-vehicle health threats. Since it is much harder to know how existential threats should be minimized[5], they tend to be ignored. High-technology societies tend to become "hierarchist" or "fatalist" in their attitudes to the ever-multiplying risks threatening them. In either case, the average member of society adopts a passive attitude to risk minimization, culturally and psychologically.
  7. The bias in popular culture is toward giving extinction-scenario stories non-extinction outcomes. (None of the 16 'most notable' WW3 scenarios in film is resolved by human extinction, for example.[6])
  8. The threat of nuclear annihilation actually was a daily concern in the lives of many people in the 1960s and 1970s. Since then the principal fear has been of localized terrorist attack, rather than a global war of extinction; contemplating human extinction may be out of fashion.
  9. Many people believe that if human extinction did occur, the amount of research done prior to the event would be irrelevant (as humanity would no longer exist).
  10. Some people have philosophical reasons for doubting the possibility of human extinction, for instance the final anthropic principle, plenitude principle or intrinsic finality.
  11. Tversky and Kahneman have produced evidence that humans suffer cognitive biases which would tend to minimize the perception of this unprecedented event:
    1. Denial is a negative "availability heuristic" shown to occur when an outcome is so upsetting that the very act of thinking about it leads to an increased refusal to believe it might occur. In this case, imagining human extinction probably makes it seem less likely.
    2. In cultures where human extinction is not expected the proposition must overcome the "disconfirmation bias" against heterodox theories.
    3. Another reliable psychological effect relevant here is the "positive outcome bias".
    4. Behavioural finance offers strong evidence that recent experience is given undue weight in risk analysis. Roughly speaking, "100-year storms" tend to occur every twenty years in the stock market as traders become convinced that the current good times will last forever. Doomsayers who hypothesize rare crisis scenarios are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the subjective probability of the unprecedented[7].
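
A toy numerical illustration of the estimation problem in point 2 above: a frequentist estimate of the annual probability of an event with zero observed occurrences is exactly zero, whereas a Bayesian device such as Laplace's rule of succession yields a small but non-zero figure. The observation window below is an assumed number chosen for illustration; this is not a serious extinction-risk model.

```python
# Toy contrast between a frequentist estimate and Laplace's rule of
# succession for a never-observed event. All numbers are assumptions.
years_observed = 200_000   # assumed span of Homo sapiens history
extinctions_seen = 0       # human extinction has never been observed

frequentist = extinctions_seen / years_observed          # exactly 0.0
laplace = (extinctions_seen + 1) / (years_observed + 2)  # ~5.0e-06

print(f"Frequentist annual probability:        {frequentist}")
print(f"Rule-of-succession annual probability: {laplace:.2e}")
```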

In general, humanity's sense of self-preservation and its intelligence are considered to offer safeguards against extinction. It is felt that people will find creative ways to overcome potential threats, and will apply the precautionary principle when attempting dangerous innovations. The arguments against this are, firstly, that the management of destructive technology is becoming more difficult, and secondly, that the precautionary principle is often abandoned whenever the reward appears to outweigh the risk. In at least one instance the principle may have been overruled: prior to the Trinity nuclear test, one of the project's scientists (Edward Teller) speculated that the fission explosion might destroy New Mexico, and possibly the world, by igniting a runaway reaction in the nitrogen of the atmosphere. A calculation by Hans Bethe showed such a possibility to be theoretically impossible, but the fear remained among some until the test took place. (See Ignition of the atmosphere with nuclear bombs, report LA-602, available online, and Manhattan Project.)

Observations about human extinction

The fact that the vast majority of species that have existed on Earth have become extinct has led to the suggestion that all species have a finite lifespan, and that human extinction would thus be inevitable. David Raup and Jack Sepkoski found, for example, a twenty-six-million-year periodicity in elevated extinction rates, caused by unknown factors (see David M. Raup, Extinction: Bad Genes or Bad Luck? (Norton, 1992)). Based on evidence of past extinction rates, Raup and others have suggested that the average longevity of an invertebrate species is between 4 and 6 million years, while that of vertebrates seems to be 2 to 4 million years. The shorter survival of mammalian species is attributed to their position further up the food chain than many invertebrates, and therefore a greater liability to suffer the effects of environmental change.

A counter-argument is that humans are unique in their adaptive and technological capabilities, so reliable inferences about the probability of human extinction cannot be drawn from the past extinctions of other species. Indeed, the evidence collected by Raup and others suggests that generalist, geographically dispersed species, like humans, generally have a lower rate of extinction than species that require a particular habitat. In addition, the human species is probably the only one with conscious prior knowledge of its own possible demise, and would therefore be likely to take steps to avoid it.
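
As a rough illustration of what such species-lifespan figures imply, and not a calculation drawn from Raup's work, one can assume a constant-hazard model in which a mean species lifespan of L years corresponds to an annual extinction probability of about 1/L. The 3-million-year figure below is simply the midpoint of the vertebrate range quoted above.

```python
import math

# Constant-hazard (exponential survival) model: an assumption for
# illustration, not Raup's analysis. Mean lifespan L gives an annual
# hazard of 1/L and survival probability exp(-t/L) over t years.
MEAN_LIFESPAN_YEARS = 3_000_000   # midpoint of the 2-4 Myr vertebrate range

hazard = 1 / MEAN_LIFESPAN_YEARS
for horizon in (1_000, 10_000, 100_000, 1_000_000):
    p_survive = math.exp(-horizon * hazard)
    print(f"P(species survives the next {horizon:>9,} years) = {p_survive:.4f}")
```

Under these assumptions the implied background risk is tiny on historical timescales (about a 0.3% chance of extinction over the next 10,000 years), which is consistent with the counter-argument that past extinction statistics say little about near-term human prospects.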

Another characteristic of humans that may be unique is religious belief, which in most situations encourages respect for life; on the other hand, it may also create the conditions for warfare and genocide. As a result, thinkers such as Albert Einstein believed that "We shall require a substantially new manner of thinking if mankind is to survive."[2]

Humans are very similar to other primates in their propensity towards intra-species violence; Jared Diamond's The Third Chimpanzee (ISBN 0-09-980180-9) estimates that 64% of hunter-gatherer societies engage in warfare every two years. Although it has been argued (e.g. in the UNESCO Seville Statement) that warfare is a cultural artifact, many anthropologists[citation needed] dispute this, noting that small human tribes exhibit patterns of violence similar to those of chimpanzee groups, the most murderous of the primates and our nearest living genetic relatives. The 'higher' functions of reason and speech are more developed in the brain of Homo sapiens than in other primates, but the relative size of the limbic system is constant across apes, monkeys and humans; as human rational faculties have expanded, so has the wetware of emotion. The combination of inventiveness and the urge to violence in humans has been cited as evidence against humanity's long-term survival[8].[opinion needs balancing]

Omnicide

Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through nuclear warfare,[3][4][5] but it can also apply to extinction through means such as global anthropogenic ecological catastrophe.[6]

Omnicide can be considered a subcategory of genocide.[7] Using the concept in this way, one can argue, for example, that:

The arms race is genocidal in intent given the fact that the United States and the Soviet Union are knowingly preparing to destroy each other as viable national and political groups.[8]

As this claim illustrates, the concept of omnicide raises issues of human agency and, hence, of moral responsibility in discussions about large-scale social processes like the nuclear arms race or ecologically destructive industrial production. That is, part of the point of describing a human extinction scenario as 'omnicidal' is to note that, if it were to happen, it would result not just from natural, uncontrollable evolutionary forces, or from some random catastrophe like an asteroid impact, but from deliberate choices made by human beings. This implies that such scenarios are preventable, and that the people whose choices make them more likely to happen should be held morally accountable for such choices. In this context, the label 'omnicide' also works to de-normalize the course of action it is applied to.

Scenarios of the world without humans

The book The World Without Us by Alan Weisman is a thought experiment about what would happen to the planet, and especially to man-made infrastructure, if humans suddenly disappeared. Weisman suggests that apes, having the highest intelligence among animals other than humans, may be the lineage that succeeds humanity. The Discovery Channel film The Future Is Wild shows a possible future of evolution on Earth without humans. The History Channel two-hour special Life After People examines the possible future of life on Earth without humans. The National Geographic Channel special Aftermath: Population Zero envisions what the world would be like if all humans suddenly disappeared.

References

  1. ^ Warwick, K. I, Cyborg. University of Illinois Press, 2004.
  2. ^ The Nobel Peace Prize 1985 – Presentation Speech.
  3. ^ Somerville, John. 1981. Soviet Marxism and Nuclear War: An International Debate: From the Proceedings of the Special Colloquium of the XVth World Congress of Philosophy. Greenwood Press. p. 151.
  4. ^ Goodman, Lisl Marburg and Lee Ann Hoff. 1990. Omnicide: The Nuclear Dilemma. New York: Praeger.
  5. ^ Landes, Daniel (ed.). 1991. Confronting Omnicide: Jewish Reflections on Weapons of Mass Destruction. Jason Aronson Publishers.
  6. ^ Wilcox, Richard Brian. 2004. The Ecology of Hope: Environmental Grassroots Activism in Japan. Ph.D. dissertation, Union Institute & University, College of Graduate Studies. p. 55.
  7. ^ Jones, Adam. 2006. "A Seminal Work on Genocide". Security Dialogue, vol. 37(1), pp. 143–144.
  8. ^ Santoni, Ronald E. 1987. "Genocide, Nuclear Omnicide, and Individual Responsibility". Social Science Record, vol. 24(2), pp. 38–41.

Notes

^  Von Neumann said it was "absolutely certain (1) that there would be a nuclear war; and (2) that everyone would die in it" (emphasis added; quoted from The Nature of the Physical Universe, John Wiley & Sons, 1979, ISBN 0-471-03190-9, in H. Putnam's essay "The place of facts in a world of values", page 113). This example illustrates why respectable scientists are very reluctant to go on record with extinction predictions: they can never be proven right. (The quotation is repeated by Leslie (1996) on page 26, on the subject of nuclear-war annihilation, which he still considered a significant risk in the mid-1990s.)

^  Although existential risks are less manageable by individuals than health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the "universal" Doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "...you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe." Source: "Practical application", page 39 of the Princeton University paper Philosophical Implications of Inflationary Cosmology.

^  The 2000 review Armageddon at the Millennial Dawn in The Journal of Religion and Film finds that "While end of the world threats perhaps are not avoidable, the cinematic formulation of millennial doom promotes the notion that the end can be averted through employing human ingenuity, scientific advance, and heroism." Since this review was conducted, there has been a Hollywood production which postulates a (far-future) outcome in which humans are extinct (at least in the wild): A.I.

^  For research on this, see Psychological Science, volume 15 (2004): "Decisions From Experience and the Effect of Rare Events in Risky Choice". The under-perception of rare events mentioned above is actually the opposite of the phenomenon originally described by Kahneman in "prospect theory" (in the original experiments the likelihood of rare events was over-estimated). However, further analysis of the bias has shown that both forms occur: when judging from a description, people tend to over-estimate the stated probability, so this effect taken alone would suggest that reading the extinction scenarios described here should make the reader over-estimate any probabilities given. However, the effect more relevant to common consideration of human extinction is the bias that occurs with estimates from experience, which runs in the opposite direction: judging from personal experience, people who have never lived through their species becoming extinct would be expected to dramatically under-estimate its likelihood. Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth." ("Is Humanity Suicidal?" New York Times Magazine, May 30, 1993.)

^  A 1996 Abrupt.org editorial lists (and condemns) the arguments for humanity's tendency to self-destruction. In this view, the history of humanity suggests that humans will be the cause of their own extinction. However, others have reached the opposite conclusion from the same data on violence, and hypothesize that as societies develop armies and weapons of greater destructive power, those weapons tend to be used less often. It is claimed that this implies a more secure future, despite the development of WMD technology; as such, this argument may constitute a form of deterrence theory. Counter-arguments to such views include the following: (1) all weapons ever designed have ultimately been used, and states with strong military forces tend to engage in military aggression; (2) although modern states have so far generally shown restraint in unleashing their most potent weapons, whatever rational control was guaranteed by government monopoly over such weapons becomes increasingly irrelevant in a world where individuals have access to the technology of mass destruction (as proposed in Our Final Hour, for example).

^  ReligiousTolerance.org says that Aum Supreme Truth is the only religion known to have planned Armageddon for non-believers. Their intention to unleash deadly viruses is covered in Our Final Hour, and by Aum watcher, Akihiko Misawa. The Gaia Liberation Front advocates (but is not known to have active plans for) total human genocide, see: GLF, A Modest Proposal. Leslie, 1996 says that Aum’s collection of nuclear physicists presented a doomsday threat from nuclear destruction as well, especially as the cult included a rocket scientist.

^  Leslie (1996) discusses the survivorship bias, which he calls an "observational selection" effect (page 139). He says that the a priori certainty of observing an "undisasterous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe." (From "Random dynamics and relations between the number of fermion generations and the fine structure constants", Acta Physica Polonica B, May 1989.)

^  For example, in the essay Why the future doesn't need us, computer scientist Bill Joy argued that human beings are likely to guarantee their own extinction through transhumanism. See: Wired archive, Why the future doesn't need us.

^  For the "West Germany" extrapolation see Leslie, 1996 (The End of the World), in the "War, Pollution, and Disease" chapter (page 74). In this section the author also mentions the success (in lowering the birth rate) of programs such as the sterilization-for-rupees program in India, and surveys other infertility or falling-birth-rate extinction scenarios. He says that voluntary small-family behaviour may be counter-evolutionary, but that the meme for small, rich families appears to be spreading rapidly throughout the world. The world population is expected to start falling in 2150.

^ See the estimate of contact's probability at galactic-guide. Former NASA consultant David Brin's lengthy rebuttal of SETI enthusiasts' optimism about alien intentions concludes: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal." (See full text at SETIleague.org.)
