
Human extinction

Human extinction is the hypothetical end of the human species, either by population decline due to extraneous natural causes, such as an asteroid impact or large-scale volcanism, or via anthropogenic destruction (self-extinction), for example by sub-replacement fertility.

Nuclear war is an often-predicted cause of the extinction of mankind.[1]

Some of the many possible contributors to anthropogenic hazard are climate change, global nuclear annihilation, biological warfare, weapons of mass destruction, and ecological collapse. Other scenarios center on emerging technologies, such as advanced artificial intelligence, biotechnology, or self-replicating nanobots.

The scientific consensus is that there is a relatively low risk of near-term human extinction due to natural causes.[2][3] The likelihood of human extinction through mankind's own activities, however, is a current area of research and debate.

History of thought

Early history of thinking about human extinction

Before the 18th and 19th centuries, the possibility that humans or other organisms could become extinct was viewed with scepticism.[4] It contradicted the principle of plenitude, a doctrine that all possible things exist.[4] The principle traces back to Aristotle, and was an important tenet of Christian theology.[5] Ancient philosophers such as Plato, Aristotle, and Lucretius wrote of the end of mankind only as part of a cycle of renewal. Marcion of Sinope was a proto-Protestant who advocated for antinatalism that could lead to human extinction.[6][7] Later philosophers such as Al-Ghazali, William of Ockham, and Gerolamo Cardano expanded the study of logic and probability and began wondering whether abstract worlds existed, including a world without humans. The astronomer Edmond Halley stated that the extinction of the human race may be beneficial to the future of the world.[8]

The notion that species can become extinct gained scientific acceptance during the Age of Enlightenment in the 17th and 18th centuries, and by 1800 Georges Cuvier had identified 23 extinct prehistoric species.[4] The doctrine was further gradually undermined by evidence from the natural sciences, particularly the discovery of fossil evidence of species that appeared to no longer exist, and the development of theories of evolution.[5] In On the Origin of Species, Darwin discussed the extinction of species as a natural process and a core component of natural selection.[9] Notably, Darwin was skeptical of the possibility of sudden extinction, viewing it as a gradual process. He held that the abrupt disappearances of species from the fossil record were not evidence of catastrophic extinctions, but rather represented unrecognised gaps in the record.[9]

As the possibility of extinction became more widely established in the sciences, so did the prospect of human extinction.[4] In the 19th century, human extinction became a popular topic in science (e.g., Thomas Robert Malthus's An Essay on the Principle of Population) and fiction (e.g., Jean-Baptiste Cousin de Grainville's The Last Man). In 1863, a few years after Charles Darwin published On the Origin of Species, William King proposed that Neanderthals were an extinct species of the genus Homo. The Romantic authors and poets were particularly interested in the topic.[4] Lord Byron wrote about the extinction of life on Earth in his 1816 poem "Darkness", and in 1824 envisaged humanity being threatened by a comet impact, and employing a missile system to defend against it.[4] Mary Shelley's 1826 novel The Last Man is set in a world where humanity has been nearly destroyed by a mysterious plague.[4] At the turn of the 20th century, Russian cosmism, a precursor to modern transhumanism, advocated avoiding humanity's extinction by colonizing space.[4]

Atomic era

 
Castle Romeo nuclear test on Bikini Atoll

The invention of the atomic bomb prompted a wave of discussion among scientists, intellectuals, and the public at large about the risk of human extinction.[4] In a 1945 essay, Bertrand Russell wrote that "[T]he prospect for the human race is sombre beyond all precedent. Mankind are faced with a clear-cut alternative: either we shall all perish, or we shall have to acquire some slight degree of common sense."[10] In 1950, Leo Szilard suggested it was technologically feasible to build a cobalt bomb that could render the planet unlivable. A 1950 Gallup poll found that 19% of Americans believed that another world war would mean "an end to mankind".[11] Rachel Carson's 1962 book Silent Spring raised awareness of environmental catastrophe. In 1983, Brandon Carter proposed the Doomsday argument, which used Bayesian probability to predict the total number of humans that will ever exist.

The discovery of "nuclear winter" in the early 1980s, a specific mechanism by which nuclear war could result in human extinction, again raised the issue to prominence. Writing about these findings in 1983, Carl Sagan argued that measuring the severity of extinction solely in terms of those who die "conceals its full impact", and that nuclear war "imperils all of our descendants, for as long as there will be humans."[12]

Post-Cold War

John Leslie's 1996 book The End of the World was an academic treatment of the science and ethics of human extinction. In it, Leslie considered a range of threats to humanity and what they have in common. In 2003, British Astronomer Royal Sir Martin Rees published Our Final Hour, in which he argues that advances in certain technologies create new threats to the survival of mankind and that the 21st century may be a critical moment in history when humanity's fate is decided.[13] Edited by Nick Bostrom and Milan M. Ćirković, Global Catastrophic Risks was published in 2008, a collection of essays from 26 academics on various global catastrophic and existential risks.[14] Toby Ord's 2020 book The Precipice: Existential Risk and the Future of Humanity argues that preventing existential risks is one of the most important moral issues of our time. The book discusses, quantifies, and compares different existential risks, concluding that the greatest risks are presented by unaligned artificial intelligence and biotechnology.[15]

Causes

Potential anthropogenic causes of human extinction include global thermonuclear war, deployment of a highly effective biological weapon, ecological collapse, runaway artificial intelligence, runaway nanotechnology (such as a grey goo scenario), a scientific accident involving a micro black hole or a vacuum metastability disaster, resource depletion and a concomitant population crash driven by overpopulation and increased consumption, population decline from choosing to have fewer children, and displacement of naturally evolved humans by a new species produced by genetic engineering or technological augmentation. Natural and external extinction risks include a high-fatality-rate pandemic, a supervolcanic eruption, an asteroid impact, a nearby supernova or gamma-ray burst, an extreme solar flare, or an alien invasion.

Humans (i.e. the subspecies Homo sapiens sapiens) may also be considered to have "gone extinct" simply by being replaced with distant descendants, whose continued evolution may produce new species or subspecies of Homo or of other hominids.

Without intervention by unexpected forces, the stellar evolution of the Sun is expected to make Earth uninhabitable, then destroy it. Depending on its ultimate fate, the entire universe may eventually become uninhabitable.

Probability

Natural vs. anthropogenic

Experts generally agree that anthropogenic existential risks are (much) more likely than natural risks.[16][13][17][2][18] A key difference between these risk types is that empirical evidence can place an upper bound on the level of natural risk.[2] Humanity has existed for at least 200,000 years, over which it has been subject to a roughly constant level of natural risk. If the natural risk were sufficiently high, then it would be highly unlikely that humanity would have survived as long as it has. Based on a formalization of this argument, researchers have concluded that we can be confident that natural risk is lower than 1 in 14,000 per year (equivalent to 1 in 140 per century, on average).[2]
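The survival-time argument above can be sketched numerically. This is a back-of-the-envelope illustration, not the cited paper's full Bayesian method; the 1-in-14,000 and 200,000-year figures come from the text, while the other trial risk levels are arbitrary comparisons:

```python
# Back-of-the-envelope sketch: under a constant annual extinction
# risk r, the probability of surviving T years is (1 - r)**T.
# A high r makes 200,000 years of survival implausibly lucky,
# which is what lets survival time bound natural risk from above.

def survival_probability(annual_risk: float, years: int) -> float:
    """Probability of surviving `years` under a constant annual risk."""
    return (1.0 - annual_risk) ** years

T = 200_000  # minimum age of Homo sapiens used in the argument

for r in [1 / 14_000, 1 / 87_000, 1 / 1_000_000]:
    p = survival_probability(r, T)
    print(f"annual risk 1/{round(1 / r):,}: P(survive {T:,} yr) = {p:.2e}")
```

At the cited bound of 1 in 14,000 per year, surviving 200,000 years is already a less-than-one-in-a-million event, while at 1 in 1,000,000 per year it is quite likely; the true natural risk must therefore sit toward the low end.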

Another empirical method to study the likelihood of certain natural risks is to investigate the geological record.[16] For example, the probability of a comet or asteroid impact event large enough to cause an impact winter and human extinction before the year 2100 has been estimated at one in a million.[19][20] Moreover, large supervolcano eruptions may cause a volcanic winter that could endanger the survival of humanity.[21] The geological record suggests that supervolcanic eruptions occur on average about once every 50,000 years, though most such eruptions would not reach the scale required to cause human extinction.[21] Famously, the Toba supervolcano may have come close to wiping out humanity at its last eruption (though this is contentious).[21][22]

Since anthropogenic risk is a relatively recent phenomenon, humanity's track record of survival cannot provide similar assurances.[2] Humanity has only survived 78 years since the creation of nuclear weapons, and for future technologies, there is no track record at all. This has led thinkers like Carl Sagan to conclude that humanity is currently in a "time of perils"[23] – a uniquely dangerous period in human history, where it is subject to unprecedented levels of risk, beginning from when humans first started posing risk to themselves through their actions.[16][24]

Risk estimates

Given the limitations of ordinary observation and modeling, expert elicitation is frequently used instead to obtain probability estimates.[25]

  • Humanity has a 95% probability of being extinct in 7,800,000 years, according to J. Richard Gott's formulation of the controversial doomsday argument, which argues that we have probably already lived through half the duration of human history.[26]
  • In 1996, John A. Leslie estimated a 30% risk over the next five centuries (equivalent to around 6% per century, on average).[27]
  • In 2003, Martin Rees estimated a 50% chance of collapse of civilisation in the twenty-first century.[28]
  • A 2008 survey by the Future of Humanity Institute estimated a 5% probability of extinction by super-intelligence by 2100.[29]
  • The Global Challenges Foundation's 2016 annual report estimates an annual probability of human extinction of at least 0.05% per year (equivalent to 5% per century, on average).[30]
  • A 2016 survey of AI experts found a median estimate of 5% probability that human-level AI would cause an outcome that was "extremely bad (e.g. human extinction)".[31] In 2019 the median estimate fell to 2%, but in 2022 it rose back to 5%, and in 2023 it doubled to 10%.[32]
  • In 2020, Toby Ord estimated existential risk over the next century at "1 in 6" in his book The Precipice: Existential Risk and the Future of Humanity.[16][33] He also estimated a "1 in 10" risk of extinction by unaligned AI within the next century.
  • According to a July 10, 2023 article in The Economist, scientists estimated a 12% chance of AI-caused catastrophe and a 3% chance of AI-caused extinction by 2100. They also estimated an 8% chance of nuclear war causing a global catastrophe and a 0.5625% chance of it causing human extinction.[34]
  • On May 1, 2023, the Treaty on Artificial Intelligence Safety and Cooperation (TAISC) estimated a 30.5% risk of an AI-caused catastrophe by 2200, though it also estimated a 32.2% risk of an AI-caused catastrophe by 2026 absent a six-month moratorium.[35]
  • As of November 19, 2023, Metaculus users estimate a 1% probability of human extinction by 2100.[36]
  • In a 2010 interview with The Australian, Australian scientist Frank Fenner predicted the extinction of the human race within a century, primarily as the result of human overpopulation, environmental degradation and climate change.[37]
  • According to a 2020 study published in Scientific Reports, if deforestation and resource consumption continue at current rates, they could culminate in a "catastrophic collapse in human population" and possibly "an irreversible collapse of our civilization" in the next 20 to 40 years. According to the most optimistic scenario provided by the study, the chances that human civilization survives are smaller than 10%. To avoid this collapse, the study says, humanity should pass from a civilization dominated by the economy to a "cultural society" that "privileges the interest of the ecosystem above the individual interest of its components, but eventually in accordance with the overall communal interest."[38][39]
  • Nick Bostrom, a philosopher at the University of Oxford known for his work on existential risk, argues that it would be "misguided"[40] to assume that the probability of near-term extinction is less than 25% and that it will be "a tall order" for the human race to "get our precautions sufficiently right the first time", given that an existential risk provides no opportunity to learn from failure.[3][19]
  • Philosopher John A. Leslie assigns a 70% chance of humanity surviving the next five centuries, based partly on the controversial philosophical doomsday argument that Leslie champions. Leslie's argument is somewhat frequentist, based on the observation that human extinction has never been observed, but requires subjective anthropic arguments.[41] Leslie also discusses the anthropic survivorship bias (which he calls an "observational selection" effect on page 139) and states that the a priori certainty of observing an "undisastrous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused the eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe."[42]
  • Jean-Marc Salotti calculated the probability of human extinction caused by a giant asteroid impact at between 0.03 and 0.3 for the next billion years, assuming no colonization of other planets.[43] According to that study, the most dangerous threat is a giant long-period comet, with a warning time of only a few years and therefore no time for any intervention in space or settlement on the Moon or Mars. The probability of a giant comet impact in the next hundred years is 2.2 × 10^-12.[43]
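Gott's 7.8-million-year figure in the list above follows from simple arithmetic. A minimal sketch, assuming only the "delta t" premise that our vantage point is uniformly random within humanity's total lifespan (the 200,000-year past duration is the figure used elsewhere in this article):

```python
# Gott's "delta t" argument: if the fraction of humanity's total
# lifespan already elapsed is uniform on (0, 1), then with 95%
# confidence we are not in the first or last 2.5%, so the remaining
# duration is between 1/39 and 39 times the past duration.

def gott_bounds(past_years: float, confidence: float = 0.95):
    """Confidence interval for the future duration, given the past one."""
    tail = (1.0 - confidence) / 2.0            # 0.025 in each tail at 95%
    lower = past_years * tail / (1.0 - tail)   # past / 39 at 95%
    upper = past_years * (1.0 - tail) / tail   # past * 39 at 95%
    return lower, upper

low, high = gott_bounds(200_000)
print(f"95% interval for humanity's future: {low:,.0f} to {high:,.0f} years")
# upper bound: 200,000 * 39 = 7,800,000 years, matching Gott's figure
```

The controversy over the argument concerns the uniform-vantage-point premise, not this arithmetic.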

Individual vs. species risks

Although existential risks are less manageable by individuals than – for example – health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin, the possibility of human extinction does have practical implications. For instance, if the "universal" doomsday argument is accepted, it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "... you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."[44]

Difficulty

Some scholars argue that certain scenarios such as global thermonuclear war would have difficulty eradicating every last settlement on Earth. Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas, including the underground subways of major cities, the mountains of Tibet, the remotest islands of the South Pacific, and even to McMurdo Station in Antarctica, which has contingency plans and supplies for long isolation.[45] In addition, elaborate bunkers exist for government leaders to occupy during a nuclear war.[19] The existence of nuclear submarines, which can stay hundreds of meters deep in the ocean for potentially years at a time, should also be considered. Any number of events could lead to a massive loss of human life, but if the last few, most resilient humans (see minimum viable population) are unlikely to also die off, then that particular human extinction scenario may not seem credible.[46]

Ethics

Value of human life

 
Placard against omnicide, at Extinction Rebellion (2018)

"Existential risks" are risks that threaten the entire future of humanity, whether by causing human extinction or by otherwise permanently crippling human progress.[3] Multiple scholars have argued based on the size of the "cosmic endowment" that because of the inconceivably large number of potential future lives that are at stake, even small reductions of existential risk have great value.

In one of the earliest discussions of ethics of human extinction, Derek Parfit offers the following thought experiment:[47]

I believe that if we destroy mankind, as we now can, this outcome will be much worse than most people think. Compare three outcomes:

(1) Peace.
(2) A nuclear war that kills 99% of the world's existing population.
(3) A nuclear war that kills 100%.

(2) would be worse than (1), and (3) would be worse than (2). Which is the greater of these two differences? Most people believe that the greater difference is between (1) and (2). I believe that the difference between (2) and (3) is very much greater.

— Derek Parfit

The scale of what is lost in an existential catastrophe is determined by humanity's long-term potential – what humanity could expect to achieve if it survived.[16] From a utilitarian perspective, the value of protecting humanity is the product of its duration (how long humanity survives), its size (how many humans there are over time), and its quality (on average, how good is life for future people).[16]: 273 [48] On average, species survive for around a million years before going extinct. Parfit points out that the Earth will remain habitable for around a billion years.[47] And these might be lower bounds on our potential: if humanity is able to expand beyond Earth, it could greatly increase the human population and survive for trillions of years.[49][16]: 21  The size of the foregone potential that would be lost, were humanity to become extinct, is very large. Therefore, reducing existential risk by even a small amount would have a very significant moral value.[3][50]

Carl Sagan wrote in 1983: "If we are required to calibrate extinction in numerical terms, I would be sure to include the number of people in future generations who would not be born.... (By one calculation), the stakes are one million times greater for extinction than for the more modest nuclear wars that kill "only" hundreds of millions of people. There are many other possible measures of the potential loss – including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise."[51]

Philosopher Robert Adams in 1989 rejects Parfit's "impersonal" views but speaks instead of a moral imperative for loyalty and commitment to "the future of humanity as a vast project... The aspiration for a better society – more just, more rewarding, and more peaceful... our interest in the lives of our children and grandchildren, and the hopes that they will be able, in turn, to have the lives of their children and grandchildren as projects."[52]

Philosopher Nick Bostrom argues in 2013 that preference-satisfactionist, democratic, custodial, and intuitionist arguments all converge on the common-sense view that preventing existential risk is a high moral priority, even if the exact "degree of badness" of human extinction varies between these philosophies.[53]

Parfit argues that the size of the "cosmic endowment" can be calculated from the following argument: If Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans, then there is a potential for 10^16 (or 10,000,000,000,000,000) human lives of normal duration.[54] Bostrom goes further, stating that if the universe is empty, then the accessible universe can support at least 10^34 biological human life-years; and, if some humans were uploaded onto computers, could even support the equivalent of 10^54 cybernetic human life-years.[3]
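The arithmetic behind Parfit's 10^16 figure is straightforward. A minimal sketch: the billion habitable years and billion-person population come from the text, while the 100-year average lifespan is an assumption standing in for Parfit's "normal duration":

```python
# Rough arithmetic behind Parfit's cosmic-endowment estimate.
habitable_years = 10**9       # Earth habitable for a billion more years
sustained_population = 10**9  # a billion people supported at any one time
lifespan_years = 100          # assumed "normal duration" of a life

# Total person-years available, divided by the years each life uses.
potential_lives = habitable_years * sustained_population // lifespan_years
print(f"{potential_lives:.0e}")  # prints 1e+16
```

Bostrom's 10^34 and 10^54 figures extend the same person-year accounting from Earth to the accessible universe.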

Some economists and philosophers have defended views, including exponential discounting and person-affecting views of population ethics, on which future people do not matter (or matter much less), morally speaking.[55] While these views are controversial,[19][56][57] even they would agree that an existential catastrophe would be among the worst things imaginable. It would cut short the lives of eight billion presently existing people, destroying all of what makes their lives valuable, and most likely subjecting many of them to profound suffering. So even setting aside the value of future generations, there may be strong reasons to reduce existential risk, grounded in concern for presently existing people.[58]

Beyond utilitarianism, other moral perspectives lend support to the importance of reducing existential risk. An existential catastrophe would destroy more than just humanity – it would destroy all cultural artifacts, languages, and traditions, and many of the things we value.[16][59] So moral viewpoints on which we have duties to protect and cherish things of value would see this as a huge loss that should be avoided.[16] One can also consider reasons grounded in duties to past generations. For instance, Edmund Burke writes of a "partnership ... between those who are living, those who are dead, and those who are to be born".[60] If one takes seriously the debt humanity owes to past generations, Ord argues the best way of repaying it might be to 'pay it forward', and ensure that humanity's inheritance is passed down to future generations.[16]: 49–51 

There are several economists who have discussed the importance of global catastrophic risks. For example, Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.[61] Richard Posner has argued that humanity is doing far too little, in general, about small, hard-to-estimate risks of large-scale catastrophes.[62]

Voluntary extinction

 
Voluntary Human Extinction Movement

Some philosophers adopt the antinatalist position that human extinction would not be a bad thing, but a good thing. David Benatar argues that coming into existence is always a serious harm, and therefore it is better that people do not come into existence in the future.[63] Further, Benatar, animal rights activist Steven Best, and anarchist Todd May posit that human extinction would be a positive thing for the other organisms on the planet, and the planet itself, citing, for example, the omnicidal nature of human civilization.[64][65][66] The environmental view in favor of human extinction is shared by members of the Voluntary Human Extinction Movement and the Church of Euthanasia, who call for refraining from reproduction and allowing the human species to go peacefully extinct, thus stopping further environmental degradation.[67]

In fiction

Jean-Baptiste Cousin de Grainville's 1805 science fantasy novel Le dernier homme (The Last Man), which depicts human extinction due to infertility, is considered the first modern apocalyptic novel and credited with launching the genre.[68] Other notable early works include Mary Shelley's 1826 The Last Man, depicting human extinction caused by a pandemic, and Olaf Stapledon's 1937 Star Maker, "a comparative study of omnicide".[4]

Some 21st century pop-science works, including The World Without Us by Alan Weisman, and the television specials Life After People and Aftermath: Population Zero pose a thought experiment: what would happen to the rest of the planet if humans suddenly disappeared?[69][70] A threat of human extinction, such as through a technological singularity (also called an intelligence explosion), drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide.[71] Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.[72]

See also

References

  1. ^ Di Mardi (October 15, 2020). "The grim fate that could be 'worse than extinction'". BBC News. Retrieved November 11, 2020. When we think of existential risks, events like nuclear war or asteroid impacts often come to mind.
  2. ^ a b c d e Snyder-Beattie, Andrew E.; Ord, Toby; Bonsall, Michael B. (July 30, 2019). "An upper bound for the background rate of human extinction". Scientific Reports. 9 (1): 11054. Bibcode:2019NatSR...911054S. doi:10.1038/s41598-019-47540-7. ISSN 2045-2322. PMC 6667434. PMID 31363134.
  3. ^ a b c d e Bostrom 2013.
  4. ^ a b c d e f g h i j Moynihan, Thomas (September 23, 2020). "How Humanity Came To Contemplate Its Possible Extinction: A Timeline". The MIT Press Reader. Retrieved October 11, 2020.
    See also:
    • Moynihan, Thomas (February 2020). "Existential risk and human extinction: An intellectual history". Futures. 116: 102495. doi:10.1016/j.futures.2019.102495. ISSN 0016-3287. S2CID 213388167.
    • Moynihan, Thomas (2020). X-Risk: How Humanity Discovered Its Own Extinction. MIT Press. ISBN 978-1-913029-82-1.
  5. ^ a b Darwin, Charles; Costa, James T. (2009). The Annotated Origin. Harvard University Press. p. 121. ISBN 978-0674032811.
  6. ^ Moll, S. (2010). The Arch-heretic Marcion. Wissenschaftliche Untersuchungen zum Neuen Testament. Mohr Siebeck. p. 132. ISBN 978-3-16-150268-2. Retrieved June 11, 2023.
  7. ^ Welchman, A. (2014). Politics of Religion/Religions of Politics. Sophia Studies in Cross-cultural Philosophy of Traditions and Cultures. Springer Netherlands. p. 21. ISBN 978-94-017-9448-0. Retrieved June 11, 2023.
  8. ^ Moynihan, T. (2020). X-Risk: How Humanity Discovered Its Own Extinction. MIT Press. p. 56. ISBN 978-1-913029-84-5. Retrieved October 19, 2022.
  9. ^ a b Raup, David M. (1995). "The Role of Extinction in Evolution". In Fitch, W. M.; Ayala, F. J. (eds.). Tempo And Mode in Evolution: Genetics And Paleontology 50 Years After Simpson. National Academies Press (US).
  10. ^ Russell, Bertrand (1945). . Archived from the original on August 7, 2020.
  11. ^ Erskine, Hazel Gaudet (1963). "The Polls: Atomic Weapons and Nuclear Energy". The Public Opinion Quarterly. 27 (2): 155–190. doi:10.1086/267159. JSTOR 2746913.
  12. ^ Sagan, Carl (January 28, 2009). "Nuclear War and Climatic Catastrophe: Some Policy Implications". doi:10.2307/20041818. JSTOR 20041818. Retrieved August 11, 2021.
  13. ^ a b Rees, Martin (2003). Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future In This Century - On Earth and Beyond. Basic Books. ISBN 0-465-06863-4.
  14. ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global catastrophic risks. Oxford University Press. ISBN 978-0199606504.
  15. ^ Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916. This is an equivalent, though crisper statement of Nick Bostrom's definition: "An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development." Source: Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority". Global Policy. 4:15-31.
  16. ^ a b c d e f g h i j Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916.
  17. ^ Bostrom, Nick; Sandberg, Anders (2008). "Global Catastrophic Risks Survey" (PDF). FHI Technical Report #2008-1. Future of Humanity Institute.
  18. ^ "Frequently Asked Questions". Existential Risk. Future of Humanity Institute. Retrieved July 26, 2013. The great bulk of existential risk in the foreseeable future is anthropogenic; that is, arising from human activity.
  19. ^ a b c d Matheny, Jason Gaverick (2007). (PDF). Risk Analysis. 27 (5): 1335–1344. doi:10.1111/j.1539-6924.2007.00960.x. PMID 18076500. S2CID 14265396. Archived from the original (PDF) on August 27, 2014. Retrieved July 1, 2016.
  20. ^ Asher, D.J.; Bailey, M.E.; Emel'yanenko, V.; Napier, W.M. (2005). "Earth in the cosmic shooting gallery" (PDF). The Observatory. 125: 319–322. Bibcode:2005Obs...125..319A.
  21. ^ a b c Rampino, M.R.; Ambrose, S.H. (2002). (PDF). Icarus. 156 (2): 562–569. Bibcode:2002Icar..156..562R. doi:10.1006/icar.2001.6808. Archived from the original (PDF) on September 24, 2015. Retrieved February 14, 2022.
  22. ^ Yost, Chad L.; Jackson, Lily J.; Stone, Jeffery R.; Cohen, Andrew S. (March 1, 2018). "Subdecadal phytolith and charcoal records from Lake Malawi, East Africa imply minimal effects on human evolution from the ~74 ka Toba supereruption". Journal of Human Evolution. 116: 75–94. doi:10.1016/j.jhevol.2017.11.005. ISSN 0047-2484. PMID 29477183.
  23. ^ Sagan, Carl (1994). Pale Blue Dot. Random House. pp. 305–6. ISBN 0-679-43841-6. Some planetary civilizations see their way through, place limits on what may and what must not be done, and safely pass through the time of perils. Others are not so lucky or so prudent, perish.
  24. ^ Parfit, Derek (2011). On What Matters Vol. 2. Oxford University Press. p. 616. ISBN 9780199681044. We live during the hinge of history ... If we act wisely in the next few centuries, humanity will survive its most dangerous and decisive period.
  25. ^ Rowe, Thomas; Beard, Simon (2018). "Probabilities, methodologies and the evidence base in existential risk assessments" (PDF). Working Paper, Centre for the Study of Existential Risk. Retrieved August 26, 2018.
  26. ^ J. Richard Gott, III (1993). "Implications of the Copernican principle for our future prospects". Nature. 363 (6427): 315–319. Bibcode:1993Natur.363..315G. doi:10.1038/363315a0. S2CID 4252750.
  27. ^ Leslie 1996, p. 146.
  28. ^ Rees, Martin (2004) [2003]. Our Final Century. Arrow Books. p. 9.
  29. ^ Bostrom, Nick; Sandberg, Anders (2008). "Global Catastrophic Risks Survey" (PDF). FHI Technical Report #2008-1. Future of Humanity Institute.
  30. ^ Meyer, Robinson (April 29, 2016). "Human Extinction Isn't That Unlikely". The Atlantic. Boston, Massachusetts: Emerson Collective. Retrieved April 30, 2016.
  31. ^ Grace, Katja; Salvatier, John; Dafoe, Allen; Zhang, Baobao; Evans, Owain (May 3, 2018). "When Will AI Exceed Human Performance? Evidence from AI Experts". arXiv:1705.08807 [cs.AI].
  32. ^ Strick, Katie. "Is the AI apocalypse actually coming? What life could look like if robots take over". The Standard. Retrieved May 31, 2023.
  33. ^ Purtill, Corinne. "How Close Is Humanity to the Edge?". The New Yorker. Retrieved January 8, 2021.
  34. ^ "What are the chances of an AI apocalypse?". The Economist. July 10, 2023. Retrieved July 10, 2023.
  35. ^ "A 30% Chance of AI Catastrophe: Samotsvety's Forecasts on AI Risks and the Impact of a Strong AI Treaty". Treaty on Artificial Intelligence Safety and Cooperation (TAISC). May 1, 2023. Retrieved May 1, 2023.
  36. ^ "Will humans become extinct by 2100?". Metaculus. November 12, 2017. Retrieved November 19, 2023.
  37. ^ Edwards, Lin (June 23, 2010). "Humans will be extinct in 100 years says eminent scientist". Phys.org. Retrieved January 10, 2021.
  38. ^ Nafeez, Ahmed (July 28, 2020). "Theoretical Physicists Say 90% Chance of Societal Collapse Within Several Decades". Vice. Retrieved August 2, 2021.
  39. ^ Bologna, M.; Aquino, G. (2020). "Deforestation and world population sustainability: a quantitative analysis". Scientific Reports. 10 (7631): 7631. arXiv:2006.12202. Bibcode:2020NatSR..10.7631B. doi:10.1038/s41598-020-63657-6. PMC 7203172. PMID 32376879.
  40. ^ Bostrom, Nick (2002), "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards", Journal of Evolution and Technology, vol. 9, My subjective opinion is that setting this probability lower than 25% would be misguided, and the best estimate may be considerably higher.
  41. ^ Whitmire, Daniel P. (August 3, 2017). "Implication of our technological species being first and early". International Journal of Astrobiology. 18 (2): 183–188. doi:10.1017/S1473550417000271.
  42. ^ Leslie 1996, p. 139.
  43. ^ a b Salotti, Jean-Marc (April 2022). "Human extinction by asteroid impact". Futures. 138: 102933. doi:10.1016/j.futures.2022.102933. S2CID 247718308.
  44. ^ "Practical application", page 39 of the Princeton University paper Philosophical Implications of Inflationary Cosmology. Archived May 12, 2005, at the Wayback Machine.
  45. ^ Wells, Willard (2009). Apocalypse When?. Praxis. ISBN 978-0387098364.
  46. ^ Tonn, Bruce; MacGregor, Donald (2009). "A singular chain of events". Futures. 41 (10): 706–714. doi:10.1016/j.futures.2009.07.009. S2CID 144553194. SSRN 1775342.
  47. ^ a b Parfit, Derek (1984). Reasons and Persons. Oxford University Press. pp. 453–454.
  48. ^ MacAskill, William; Yetter Chappell, Richard (2021). "Population Ethics | Practical Implications of Population Ethical Theories". Introduction to Utilitarianism. Retrieved August 12, 2021.
  49. ^ Bostrom, Nick (2009). "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas. 15 (3): 308–314. CiteSeerX 10.1.1.429.2849. doi:10.1017/s0953820800004076. S2CID 15860897.
  50. ^ Todd, Benjamin (2017). "The case for reducing existential risks". 80,000 Hours. Retrieved January 8, 2020.
  51. ^ Sagan, Carl (1983). "Nuclear war and climatic catastrophe: Some policy implications". Foreign Affairs. 62 (2): 257–292. doi:10.2307/20041818. JSTOR 20041818. S2CID 151058846.
  52. ^ Adams, Robert Merrihew (October 1989). "Should Ethics be More Impersonal? a Critical Notice of Derek Parfit, Reasons and Persons". The Philosophical Review. 98 (4): 439–484. doi:10.2307/2185115. JSTOR 2185115.
  53. ^ Bostrom 2013, pp. 23–24.
  54. ^ Parfit, Derek (1984). Reasons and Persons. Oxford: Clarendon Press. pp. 453–454.
  55. ^ Narveson, Jan (1973). "Moral Problems of Population". The Monist. 57 (1): 62–86. doi:10.5840/monist197357134. PMID 11661014.
  56. ^ Greaves, Hilary (2017). "Discounting for Public Policy: A Survey". Economics & Philosophy. 33 (3): 391–439. doi:10.1017/S0266267117000062. ISSN 0266-2671. S2CID 21730172.
  57. ^ Greaves, Hilary (2017). "Population axiology". Philosophy Compass. 12 (11): e12442. doi:10.1111/phc3.12442. ISSN 1747-9991.
  58. ^ Lewis, Gregory (May 23, 2018). "The person-affecting value of existential risk reduction". www.gregoryjlewis.com. Retrieved August 7, 2020.
  59. ^ Sagan, Carl (Winter 1983). "Nuclear War and Climatic Catastrophe: Some Policy Implications". Foreign Affairs. Council on Foreign Relations. doi:10.2307/20041818. JSTOR 20041818. Retrieved August 4, 2020.
  60. ^ Burke, Edmund (1999) [1790]. "Reflections on the Revolution in France" (PDF). In Canavan, Francis (ed.). Select Works of Edmund Burke Volume 2. Liberty Fund. p. 192.
  61. ^ Weitzman, Martin (2009). "On modeling and interpreting the economics of catastrophic climate change" (PDF). The Review of Economics and Statistics. 91 (1): 1–19. doi:10.1162/rest.91.1.1. S2CID 216093786.
  62. ^ Posner, Richard (2004). Catastrophe: Risk and Response. Oxford University Press.
  63. ^ Benatar, David (2008). Better Never to Have Been: The Harm of Coming into Existence. Oxford University Press. p. 28. ISBN 978-0199549269. Being brought into existence is not a benefit but always a harm.
  64. ^ Benatar, David (2008). Better Never to Have Been: The Harm of Coming into Existence. Oxford University Press. p. 224. ISBN 978-0199549269. Although there are many non-human species - especially carnivores - that also cause a lot of suffering, humans have the unfortunate distinction of being the most destructive and harmful species on earth. The amount of suffering in the world could be radically reduced if there were no more humans.
  65. ^ Best, Steven (2014). "Conclusion: Reflections on Activism and Hope in a Dying World and Suicidal Culture". The Politics of Total Liberation: Revolution for the 21st Century. Palgrave Macmillan. p. 165. doi:10.1057/9781137440723_7. ISBN 978-1137471116. In an era of catastrophe and crisis, the continuation of the human species in a viable or desirable form, is obviously contingent and not a given or necessary good. But considered from the standpoint of animals and the earth, the demise of humanity would be the best imaginable event possible, and the sooner the better. The extinction of Homo sapiens would remove the malignancy ravaging the planet, destroy a parasite consuming its host, shut down the killing machines, and allow the earth to regenerate while permitting new species to evolve.
  66. ^ May, Todd (December 17, 2018). "Would Human Extinction Be a Tragedy?". The New York Times. Human beings are destroying large parts of the inhabitable earth and causing unimaginable suffering to many of the animals that inhabit it. This is happening through at least three means. First, human contribution to climate change is devastating ecosystems . . . Second, the increasing human population is encroaching on ecosystems that would otherwise be intact. Third, factory farming fosters the creation of millions upon millions of animals for whom it offers nothing but suffering and misery before slaughtering them in often barbaric ways. There is no reason to think that those practices are going to diminish any time soon. Quite the opposite.
  67. ^ MacCormack, Patricia (2020). The Ahuman Manifesto: Activism for the End of the Anthropocene. Bloomsbury Academic. pp. 143, 166. ISBN 978-1350081093.
  68. ^ Wagar, W. Warren (2003). "Review of The Last Man, Jean-Baptiste François Xavier Cousin de Grainville". Utopian Studies. 14 (1): 178–180. ISSN 1045-991X. JSTOR 20718566.
  69. ^ "He imagines a world without people. But why?". The Boston Globe. August 18, 2007. Retrieved July 20, 2016.
  70. ^ Tucker, Neely (March 8, 2008). "Depopulation Boom". The Washington Post. Retrieved July 20, 2016.
  71. ^ Barcella, Laura (2012). The end: 50 apocalyptic visions from pop culture that you should know about -- before it's too late. San Francisco, CA: Zest Books. ISBN 978-0982732250.
  72. ^ Dinello, Daniel (2005). Technophobia!: science fiction visions of posthuman technology (1st ed.). Austin: University of Texas press. ISBN 978-0-292-70986-7.

Sources

Further reading

  • Boulter, Michael (2005). Extinction: Evolution and the End of Man. Columbia University Press. ISBN 978-0231128377.
  • de Bellaigue, Christopher, "A World Off the Hinges" (review of Peter Frankopan, The Earth Transformed: An Untold History, Knopf, 2023, 695 pp.), The New York Review of Books, vol. LXX, no. 18 (23 November 2023), pp. 40–42. De Bellaigue writes: "Like the Maya and the Akkadians we have learned that a broken environment aggravates political and economic dysfunction and that the inverse is also true. Like the Qing we rue the deterioration of our soils. But the lesson is never learned. [...] Denialism [...] is one of the most fundamental of human traits and helps explain our current inability to come up with a response commensurate with the perils we face." (p. 41.)
  • Holt, Jim, "The Power of Catastrophic Thinking" (review of Toby Ord, The Precipice: Existential Risk and the Future of Humanity, Hachette, 2020, 468 pp.), The New York Review of Books, vol. LXVIII, no. 3 (February 25, 2021), pp. 26–29. Jim Holt writes (p. 28): "Whether you are searching for a cure for cancer, or pursuing a scholarly or artistic career, or engaged in establishing more just institutions, a threat to the future of humanity is also a threat to the significance of what you do."
  • MacCormack, Patricia (2020). "Embracing Death, Opening the World". Australian Feminist Studies. 35 (104): 101–115. doi:10.1080/08164649.2020.1791689. S2CID 221790005.
  • Michael Moyer (September 2010). "Eternal Fascinations with the End: Why We're Suckers for Stories of Our Own Demise: Our pattern-seeking brains and desire to be special help explain our fears of the apocalypse". Scientific American.
  • Schubert, Stefan; Caviola, Lucius; Faber, Nadira S. (2019). "The Psychology of Existential Risk: Moral Judgments about Human Extinction". Scientific Reports. 9 (1): 15100. Bibcode:2019NatSR...915100S. doi:10.1038/s41598-019-50145-9. PMC 6803761. PMID 31636277.
  • Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. Bloomsbury Publishing. ISBN 1526600218.
  • Torres, Phil (2017). Morality, Foresight, and Human Flourishing: An Introduction to Existential Risks. Pitchstone Publishing. ISBN 978-1634311427.
  • Michel Weber, "Book Review: Walking Away from Empire", Cosmos and History: The Journal of Natural and Social Philosophy, vol. 10, no. 2, 2014, pp. 329–336.
  • What would happen to Earth if humans went extinct? Live Science, August 16, 2020.
  • A.I. poses human extinction risk on par with nuclear war, Sam Altman and other tech leaders warn. CNBC. May 31, 2023.
  • "Treading Thin Air: Geoff Mann on Uncertainty and Climate Change", London Review of Books, vol. 45, no. 17 (7 September 2023), pp. 17–19. "[W]e are in desperate need of a politics that looks [the] catastrophic uncertainty [of global warming and climate change] square in the face. That would mean taking much bigger and more transformative steps: all but eliminating fossil fuels... and prioritizing democratic institutions over markets. The burden of this effort must fall almost entirely on the richest people and richest parts of the world, because it is they who continue to gamble with everyone else's fate." (p. 19.)

human, extinction, omnicide, redirects, here, other, uses, omnicide, disambiguation, methodological, challenges, quantifying, mitigating, risk, proposed, mitigation, measures, related, organizations, global, catastrophic, risk, hypothetical, human, species, ei. Omnicide redirects here For other uses see Omnicide disambiguation For methodological challenges quantifying and mitigating the risk proposed mitigation measures and related organizations see Global catastrophic risk Human extinction is the hypothetical end of the human species either by population decline due to extraneous natural causes such as an asteroid impact or large scale volcanism or via anthropogenic destruction self extinction for example by sub replacement fertility Nuclear war is an often predicted cause of the extinction of mankind 1 Some of the many possible contributors to anthropogenic hazard are climate change global nuclear annihilation biological warfare weapons of mass destruction and ecological collapse Other scenarios center on emerging technologies such as advanced artificial intelligence biotechnology or self replicating nanobots The scientific consensus is that there is a relatively low risk of near term human extinction due to natural causes 2 3 The likelihood of human extinction through mankind s own activities however is a current area of research and debate Contents 1 History of thought 1 1 Early history of thinking about human extinction 1 2 Atomic era 1 3 Post Cold War 2 Causes 3 Probability 3 1 Natural vs anthropogenic 3 2 Risk estimates 3 3 Individual vs species risks 3 4 Difficulty 4 Ethics 4 1 Value of human life 4 2 Voluntary extinction 5 In fiction 6 See also 7 References 8 Sources 9 Further readingHistory of thought editEarly history of thinking about human extinction edit Before the 18th and 19th centuries the possibility that humans or other organisms could become extinct was viewed with scepticism 4 It contradicted the principle of plenitude a doctrine that all 
possible things exist 4 The principle traces back to Aristotle and was an important tenet of Christian theology 5 Ancient philosophers such as Plato Aristotle and Lucretius wrote of the end of mankind only as part of a cycle of renewal Marcion of Sinope was a proto protestant who advocated for antinatalism that could lead to human extinction 6 7 Later philosophers such as Al Ghazali William of Ockham and Gerolamo Cardano expanded the study of logic and probability and began wondering if abstract worlds existed including a world without humans Physicist Edmond Halley stated that the extinction of the human race may be beneficial to the future of the world 8 The notion that species can become extinct gained scientific acceptance during the Age of Enlightenment in the 17th and 18th centuries and by 1800 Georges Cuvier had identified 23 extinct prehistoric species 4 The doctrine was further gradually undermined by evidence from the natural sciences particularly the discovery of fossil evidence of species that appeared to no longer exist and the development of theories of evolution 5 In On the Origin of Species Darwin discussed the extinction of species as a natural process and a core component of natural selection 9 Notably Darwin was skeptical of the possibility of sudden extinction viewing it as a gradual process He held that the abrupt disappearances of species from the fossil record were not evidence of catastrophic extinctions but rather represented unrecognised gaps clarification needed in the record 9 As the possibility of extinction became more widely established in the sciences so did the prospect of human extinction 4 In the 19th century human extinction became a popular topic in science e g Thomas Robert Malthus s An Essay on the Principle of Population and fiction e g Jean Baptiste Cousin de Grainville s The Last Man In 1863 a few years after Charles Darwin published On the Origin of Species William King proposed that Neanderthals were an extinct species of 
the genus Homo The Romantic authors and poets were particularly interested in the topic 4 Lord Byron wrote about the extinction of life on Earth in his 1816 poem Darkness and in 1824 envisaged humanity being threatened by a comet impact and employing a missile system to defend against it 4 Mary Shelley s 1826 novel The Last Man is set in a world where humanity has been nearly destroyed by a mysterious plague 4 At the turn of the 20th century Russian cosmism a precursor to modern transhumanism advocated avoiding humanity s extinction by colonizing space 4 Atomic era edit nbsp Castle Romeo nuclear test on Bikini AtollThe invention of the atomic bomb prompted a wave of discussion among scientists intellectuals and the public at large about the risk of human extinction 4 In a 1945 essay Bertrand Russell wrote that T he prospect for the human race is sombre beyond all precedent Mankind are faced with a clear cut alternative either we shall all perish or we shall have to acquire some slight degree of common sense 10 In 1950 Leo Szilard suggested it was technologically feasible to build a cobalt bomb that could render the planet unlivable A 1950 Gallup poll found that 19 of Americans believed that another world war would mean an end to mankind 11 Rachel Carson s 1962 book Silent Spring raised awareness of environmental catastrophe In 1983 Brandon Carter proposed the Doomsday argument which used Bayesian probability to predict the total number of humans that will ever exist The discovery of nuclear winter in the early 1980s a specific mechanism by which nuclear war could result in human extinction again raised the issue to prominence Writing about these findings in 1983 Carl Sagan argued that measuring the severity of extinction solely in terms of those who die conceals its full impact and that nuclear war imperils all of our descendants for as long as there will be humans 12 Post Cold War edit John Leslie s 1996 book The End of The World was an academic treatment of the 
science and ethics of human extinction In it Leslie considered a range of threats to humanity and what they have in common In 2003 British Astronomer Royal Sir Martin Rees published Our Final Hour in which he argues that advances in certain technologies create new threats to the survival of mankind and that the 21st century may be a critical moment in history when humanity s fate is decided 13 Edited by Nick Bostrom and Milan M Cirkovic Global Catastrophic Risks was published in 2008 a collection of essays from 26 academics on various global catastrophic and existential risks 14 Toby Ord s 2020 book The Precipice Existential Risk and the Future of Humanity argues that preventing existential risks is one of the most important moral issues of our time The book discusses quantifies and compares different existential risks concluding that the greatest risks are presented by unaligned artificial intelligence and biotechnology 15 Causes editMain article Global catastrophe scenarios Potential anthropogenic causes of human extinction include global thermonuclear war deployment of a highly effective biological weapon an ecological collapse runaway artificial intelligence runaway nanotechnology such as a grey goo scenario a scientific accident involving a micro black hole or vacuum metastability disaster overpopulation and increased consumption pose the risk of resource depletion and a concomitant population crash population decline by choosing to have fewer children displacement of naturally evolved humans by a new species produced by genetic engineering or technological augmentation Natural and external extinction risks include high fatality rate pandemic supervolcanic eruption asteroid impact nearby supernova or gamma ray bursts extreme solar flare or alien invasion Humans e g Homo sapiens sapiens as a species may also be considered to have gone extinct simply by being replaced with distant descendants whose continued evolution may produce new species or subspecies Homo 
or of hominids Without intervention by unexpected forces the stellar evolution of the Sun is expected to make Earth uninhabitable then destroy it Depending on its ultimate fate the entire universe may eventually become uninhabitable Probability editNatural vs anthropogenic edit Experts generally agree that anthropogenic existential risks are much more likely than natural risks 16 13 17 2 18 A key difference between these risk types is that empirical evidence can place an upper bound on the level of natural risk 2 Humanity has existed for at least 200 000 years over which it has been subject to a roughly constant level of natural risk If the natural risk were sufficiently high then it would be highly unlikely that humanity would have survived as long as it has Based on a formalization of this argument researchers have concluded that we can be confident that natural risk is lower than 1 in 14 000 per year equivalent to 1 in 140 per century on average 2 Another empirical method to study the likelihood of certain natural risks is to investigate the geological record 16 For example a comet or asteroid impact event sufficient in scale to cause an impact winter that would cause human extinction before the year 2100 has been estimated at one in a million 19 20 Moreover large supervolcano eruptions may cause a volcanic winter that could endanger the survival of humanity 21 The geological record suggests that supervolcanic eruptions are estimated to occur on average about once every 50 000 years though most such eruptions would not reach the scale required to cause human extinction 21 Famously the supervolcano Mt Toba may have almost wiped out humanity at the time of its last eruption though this is contentious 21 22 Since anthropogenic risk is a relatively recent phenomenon humanity s track record of survival cannot provide similar assurances 2 Humanity has only survived 78 years since the creation of nuclear weapons and for future technologies there is no track record at 
all This has led thinkers like Carl Sagan to conclude that humanity is currently in a time of perils 23 a uniquely dangerous period in human history where it is subject to unprecedented levels of risk beginning from when humans first started posing risk to themselves through their actions 16 24 Risk estimates edit Given the limitations of ordinary observation and modeling expert elicitation is frequently used instead to obtain probability estimates 25 Humanity has a 95 probability of being extinct in 7 800 000 years according to J Richard Gott s formulation of the controversial doomsday argument which argues that we have probably already lived through half the duration of human history 26 In 1996 John A Leslie estimated a 30 risk over the next five centuries equivalent to around 6 per century on average 27 In 2003 Martin Rees estimated a 50 chance of collapse of civilisation in the twenty first century 28 A 2008 survey by the Future of Humanity Institute estimated a 5 probability of extinction by super intelligence by 2100 29 The Global Challenges Foundation s 2016 annual report estimates an annual probability of human extinction of at least 0 05 per year equivalent to 5 per century on average 30 A 2016 survey of AI experts found a median estimate of 5 that human level AI would cause an outcome that was extremely bad e g human extinction 31 In 2019 the risk was lowered to 2 but in 2022 it was increased back to 5 In 2023 the risk doubled to 10 32 In 2020 Toby Ord estimates existential risk in the next century at 1 in 6 in his book The Precipice Existential Risk and the Future of Humanity 16 33 He also estimated a 1 in 10 risk of extinction by unaligned AI within the next century According to the July 10 2023 article of The Economist scientists estimated a 12 chance of AI caused catastrophe and a 3 chance of AI caused extinction by 2100 They also estimate a 8 chance of Nuclear War causing global catastrophe and a 0 5625 chance of Nuclear War causing Human Extinction 
34 In May 1 2023 The Treaty on Artificial Intelligence Safety and Cooperation TAISC has estimated a 30 5 risk of an AI caused catastrophe by 2200 although they also estimate a 32 2 risk of an AI caused catastrophe by 2026 if there is no 6 month moratorium 35 As of November 19 2023 Metaculus users estimate a 1 probability of human extinction by 2100 36 In a 2010 interview with The Australian Australian scientist Frank Fenner predicted the extinction of the human race within a century primarily as the result of human overpopulation environmental degradation and climate change 37 According to a 2020 study published in Scientific Reports if deforestation and resource consumption continue at current rates they could culminate in a catastrophic collapse in human population and possibly an irreversible collapse of our civilization in the next 20 to 40 years According to the most optimistic scenario provided by the study the chances that human civilization survives are smaller than 10 To avoid this collapse the study says humanity should pass from a civilization dominated by the economy to a cultural society that privileges the interest of the ecosystem above the individual interest of its components but eventually in accordance with the overall communal interest 38 39 Nick Bostrom a philosopher at the University of Oxford known for his work on existential risk argues that it would be misguided 40 to assume that the probability of near term extinction is less than 25 and that it will be a tall order for the human race to get our precautions sufficiently right the first time given that an existential risk provides no opportunity to learn from failure 3 19 Philosopher John A Leslie assigns a 70 chance of humanity surviving the next five centuries based partly on the controversial philosophical doomsday argument that Leslie champions Leslie s argument is somewhat frequentist based on the observation that human extinction has never been observed but requires subjective 
anthropic arguments 41 Leslie also discusses the anthropic survivorship bias which he calls an observational selection effect on page 139 and states that the a priori certainty of observing an undisastrous past could make it difficult to argue that we must be safe because nothing terrible has yet occurred He quotes Holger Bech Nielsen s formulation We do not even know if there should exist some extremely dangerous decay of say the proton which caused the eradication of the earth because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe 42 Jean Marc Salotti calculated the probability of human extinction caused by a giant asteroid impact 43 It is between 0 03 and 0 3 for the next billion years if there is no colonization of other planets According to that study the most frightening object is a giant long period comet with a warning time of a few years only and therefore no time for any intervention in space or settlement on the Moon or Mars The probability of a giant comet impact in the next hundred years is 2 2E 12 43 Further information Nuclear holocaust Likelihood of complete human extinction Individual vs species risks edit Although existential risks are less manageable by individuals than for example health risks according to Ken Olum Joshua Knobe and Alexander Vilenkin the possibility of human extinction does have practical implications For instance if the universal doomsday argument is accepted it changes the most likely source of disasters and hence the most efficient means of preventing them They write you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one You should not worry especially about the chance that some specific nearby star will become a supernova but more about the chance that supernovas are more deadly to nearby life than we believe 44 Difficulty edit Some scholars argue that certain scenarios such as global 
thermonuclear war would have difficulty eradicating every last settlement on Earth Physicist Willard Wells points out that any credible extinction scenario would have to reach into a diverse set of areas including the underground subways of major cities the mountains of Tibet the remotest islands of the South Pacific and even to McMurdo Station in Antarctica which has contingency plans and supplies for long isolation 45 In addition elaborate bunkers exist for government leaders to occupy during a nuclear war 19 The existence of nuclear submarines which can stay hundreds of meters deep in the ocean for potentially years at a time should also be considered Any number of events could lead to a massive loss of human life but if the last few see minimum viable population most resilient humans are unlikely to also die off then that particular human extinction scenario may not seem credible 46 Ethics editValue of human life edit nbsp Placard against omnicide at Extinction Rebellion 2018 Existential risks are risks that threaten the entire future of humanity whether by causing human extinction or by otherwise permanently crippling human progress 3 Multiple scholars have argued based on the size of the cosmic endowment that because of the inconceivably large number of potential future lives that are at stake even small reductions of existential risk have great value In one of the earliest discussions of ethics of human extinction Derek Parfit offers the following thought experiment 47 I believe that if we destroy mankind as we now can this outcome will be much worse than most people think Compare three outcomes 1 Peace 2 A nuclear war that kills 99 of the world s existing population 3 A nuclear war that kills 100 2 would be worse than 1 and 3 would be worse than 2 Which is the greater of these two differences Most people believe that the greater difference is between 1 and 2 I believe that the difference between 2 and 3 is very much greater Derek Parfit The scale of what is 
lost in an existential catastrophe is determined by humanity s long term potential what humanity could expect to achieve if it survived 16 From a utilitarian perspective the value of protecting humanity is the product of its duration how long humanity survives its size how many humans there are over time and its quality on average how good is life for future people 16 273 48 On average species survive for around a million years before going extinct Parfit points out that the Earth will remain habitable for around a billion years 47 And these might be lower bounds on our potential if humanity is able to expand beyond Earth it could greatly increase the human population and survive for trillions of years 49 16 21 The size of the foregone potential that would be lost were humanity to become extinct is very large Therefore reducing existential risk by even a small amount would have a very significant moral value 3 50 Carl Sagan wrote in 1983 If we are required to calibrate extinction in numerical terms I would be sure to include the number of people in future generations who would not be born By one calculation the stakes are one million times greater for extinction than for the more modest nuclear wars that kill only hundreds of millions of people There are many other possible measures of the potential loss including culture and science the evolutionary history of the planet and the significance of the lives of all of our ancestors who contributed to the future of their descendants Extinction is the undoing of the human enterprise 51 Philosopher Robert Adams in 1989 rejects Parfit s impersonal views but speaks instead of a moral imperative for loyalty and commitment to the future of humanity as a vast project The aspiration for a better society more just more rewarding and more peaceful our interest in the lives of our children and grandchildren and the hopes that they will be able in turn to have the lives of their children and grandchildren as projects 52 
Philosopher Nick Bostrom argues in 2013 that preference satisfactionist democratic custodial and intuitionist arguments all converge on the common sense view that preventing existential risk is a high moral priority even if the exact degree of badness of human extinction varies between these philosophies 53 Parfit argues that the size of the cosmic endowment can be calculated from the following argument If Earth remains habitable for a billion more years and can sustainably support a population of more than a billion humans then there is a potential for 1016 or 10 000 000 000 000 000 human lives of normal duration 54 Bostrom goes further stating that if the universe is empty then the accessible universe can support at least 1034 biological human life years and if some humans were uploaded onto computers could even support the equivalent of 1054 cybernetic human life years 3 Some economists and philosophers have defended views including exponential discounting and person affecting views of population ethics on which future people do not matter or matter much less morally speaking 55 While these views are controversial 19 56 57 even they would agree that an existential catastrophe would be among the worst things imaginable It would cut short the lives of eight billion presently existing people destroying all of what makes their lives valuable and most likely subjecting many of them to profound suffering So even setting aside the value of future generations there may be strong reasons to reduce existential risk grounded in concern for presently existing people 58 Beyond utilitarianism other moral perspectives lend support to the importance of reducing existential risk An existential catastrophe would destroy more than just humanity it would destroy all cultural artifacts languages and traditions and many of the things we value 16 59 So moral viewpoints on which we have duties to protect and cherish things of value would see this as a huge loss that should be avoided 
One can also consider reasons grounded in duties to past generations. For instance, Edmund Burke writes of a "partnership... between those who are living, those who are dead, and those who are to be born".[60] If one takes seriously the debt humanity owes to past generations, Ord argues, the best way of repaying it might be to "pay it forward" and ensure that humanity's inheritance is passed down to future generations.[16]:49–51

Several economists have discussed the importance of global catastrophic risks. For example, Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.[61] Richard Posner has argued that humanity is doing far too little, in general, about small, hard-to-estimate risks of large-scale catastrophes.[62]

Voluntary extinction

Some philosophers adopt the antinatalist position that human extinction would not be a bad thing, but a good thing. David Benatar argues that coming into existence is always serious harm, and therefore it is better that people do not come into existence in the future.[63] Further, Benatar, animal rights activist Steven Best, and anarchist Todd May posit that human extinction would be a positive thing for the other organisms on the planet, and the planet itself, citing, for example, the omnicidal nature of human civilization.[64][65][66] The environmental view in favor of human extinction is shared by the members of the Voluntary Human Extinction Movement and the Church of Euthanasia, who call for refraining from reproduction and allowing the human species to go peacefully extinct, thus stopping further environmental degradation.[67]

In fiction

Main article: Apocalyptic and post-apocalyptic fiction

Jean-Baptiste Cousin de Grainville's 1805 science fantasy novel Le dernier homme (The Last Man), which depicts human extinction due to infertility, is considered the
first modern apocalyptic novel and is credited with launching the genre.[68] Other notable early works include Mary Shelley's 1826 The Last Man, depicting human extinction caused by a pandemic, and Olaf Stapledon's 1937 Star Maker, "a comparative study of omnicide".[4]

Some 21st-century pop-science works, including The World Without Us by Alan Weisman and the television specials Life After People and Aftermath: Population Zero, pose a thought experiment: what would happen to the rest of the planet if humans suddenly disappeared?[69][70] A threat of human extinction, such as through a technological singularity (also called an intelligence explosion), drives the plot of innumerable science fiction stories; an influential early example is the 1951 film adaptation of When Worlds Collide.[71] Usually the extinction threat is narrowly avoided, but some exceptions exist, such as R.U.R. and Steven Spielberg's A.I.[72]

See also

Societal collapse
Eschatology
Extinction event
Extinction Rebellion
Global catastrophic risk
Great Filter
Holocene extinction
Speculative evolution
Voluntary Human Extinction Movement
World War III

References

Di Mardi (October 15, 2020). "The grim fate that could be worse than extinction". BBC News. Retrieved November 11, 2020. "When we think of existential risks, events like nuclear war or asteroid impacts often come to mind."
Snyder-Beattie, Andrew E.; Ord, Toby; Bonsall, Michael B. (July 30, 2019). "An upper bound for the background rate of human extinction". Scientific Reports. 9 (1): 11054. Bibcode:2019NatSR...911054S. doi:10.1038/s41598-019-47540-7. ISSN 2045-2322. PMC 6667434. PMID 31363134.
Bostrom 2013.
Moynihan, Thomas (September 23, 2020). "How Humanity Came To Contemplate Its Possible Extinction: A Timeline". The MIT Press Reader. Retrieved October 11, 2020. See also: Moynihan, Thomas (February 2020). "Existential risk and human extinction: An intellectual history". Futures. 116: 102495. doi:10.1016/j.futures.2019.102495. ISSN 0016-3287. S2CID 213388167; and Moynihan, Thomas (2020). X-Risk: How
Humanity Discovered Its Own Extinction. MIT Press. ISBN 978-1-913029-82-1.
Darwin, Charles; Costa, James T. (2009). The Annotated Origin. Harvard University Press. p. 121. ISBN 978-0674032811.
Moll, S. (2010). The Arch-heretic Marcion. Wissenschaftliche Untersuchungen zum Neuen Testament. Mohr Siebeck. p. 132. ISBN 978-3-16-150268-2. Retrieved June 11, 2023.
Welchman, A. (2014). Politics of Religion/Religions of Politics. Sophia Studies in Cross-cultural Philosophy of Traditions and Cultures. Springer Netherlands. p. 21. ISBN 978-94-017-9448-0. Retrieved June 11, 2023.
Moynihan, T. (2020). X-Risk: How Humanity Discovered Its Own Extinction. MIT Press. p. 56. ISBN 978-1-913029-84-5. Retrieved October 19, 2022.
Raup, David M. (1995). "The Role of Extinction in Evolution". In Fitch, W. M.; Ayala, F. J. (eds.). Tempo And Mode in Evolution: Genetics And Paleontology 50 Years After Simpson. National Academies Press (US).
Russell, Bertrand (1945). "The Bomb and Civilization". Archived from the original on August 7, 2020.
Erskine, Hazel Gaudet (1963). "The Polls: Atomic Weapons and Nuclear Energy". The Public Opinion Quarterly. 27 (2): 155–190. doi:10.1086/267159. JSTOR 2746913.
Sagan, Carl (January 28, 2009). "Nuclear War and Climatic Catastrophe: Some Policy Implications". doi:10.2307/20041818. JSTOR 20041818. Retrieved August 11, 2021.
Rees, Martin (2003). Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future In This Century, On Earth and Beyond. Basic Books. ISBN 0-465-06863-4.
Bostrom, Nick; Cirkovic, Milan M., eds. (2008). Global catastrophic risks. Oxford University Press. ISBN 978-0199606504.
Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916. This is an equivalent, though crisper, statement of Nick Bostrom's definition: "An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development." Source: Bostrom, Nick (2013). "Existential Risk Prevention as Global
Priority". Global Policy. 4: 15–31.
Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916.
Bostrom, Nick; Sandberg, Anders (2008). "Global Catastrophic Risks Survey" (PDF). FHI Technical Report #2008-1. Future of Humanity Institute.
"Frequently Asked Questions". Existential Risk. Future of Humanity Institute. Retrieved July 26, 2013. "The great bulk of existential risk in the foreseeable future is anthropogenic; that is, arising from human activity."
Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction" (PDF). Risk Analysis. 27 (5): 1335–1344. doi:10.1111/j.1539-6924.2007.00960.x. PMID 18076500. S2CID 14265396. Archived from the original (PDF) on August 27, 2014. Retrieved July 1, 2016.
Asher, D. J.; Bailey, M. E.; Emel'yanenko, V.; Napier, W. M. (2005). "Earth in the cosmic shooting gallery" (PDF). The Observatory. 125: 319–322. Bibcode:2005Obs...125..319A.
Rampino, M. R.; Ambrose, S. H. (2002). "Super eruptions as a threat to civilizations on Earth-like planets" (PDF). Icarus. 156 (2): 562–569. Bibcode:2002Icar..156..562R. doi:10.1006/icar.2001.6808. Archived from the original (PDF) on September 24, 2015. Retrieved February 14, 2022.
Yost, Chad L.; Jackson, Lily J.; Stone, Jeffery R.; Cohen, Andrew S. (March 1, 2018). "Subdecadal phytolith and charcoal records from Lake Malawi, East Africa, imply minimal effects on human evolution from the 74 ka Toba supereruption". Journal of Human Evolution. 116: 75–94. doi:10.1016/j.jhevol.2017.11.005. ISSN 0047-2484. PMID 29477183.
Sagan, Carl (1994). Pale Blue Dot. Random House. pp. 305–6. ISBN 0-679-43841-6. "Some planetary civilizations see their way through, place limits on what may and what must not be done, and safely pass through the time of perils. Others are not so lucky or so prudent, perish."
Parfit, Derek (2011). On What Matters. Vol. 2. Oxford University Press. p. 616. ISBN 9780199681044. "We live during the hinge of history... If we act wisely in the next few centuries, humanity will survive its most dangerous and decisive period."
Rowe, Thomas; Beard, Simon (2018). "Probabilities,
methodologies, and the evidence base in existential risk assessments" (PDF). Working Paper. Centre for the Study of Existential Risk. Retrieved August 26, 2018.
Gott, J. Richard, III (1993). "Implications of the Copernican principle for our future prospects". Nature. 363 (6427): 315–319. Bibcode:1993Natur.363..315G. doi:10.1038/363315a0. S2CID 4252750.
Leslie 1996, p. 146.
Rees, Martin (2004) [2003]. Our Final Century. Arrow Books. p. 9.
Bostrom, Nick; Sandberg, Anders (2008). "Global Catastrophic Risks Survey" (PDF). FHI Technical Report #2008-1. Future of Humanity Institute.
Meyer, Robinson (April 29, 2016). "Human Extinction Isn't That Unlikely". The Atlantic. Boston, Massachusetts: Emerson Collective. Retrieved April 30, 2016.
Grace, Katja; Salvatier, John; Dafoe, Allen; Zhang, Baobao; Evans, Owain (May 3, 2018). "When Will AI Exceed Human Performance? Evidence from AI Experts". arXiv:1705.08807 [cs.AI].
Strick, Katie. "Is the AI apocalypse actually coming? What life could look like if robots take over". The Standard. Retrieved May 31, 2023.
Purtill, Corinne. "How Close Is Humanity to the Edge?". The New Yorker. Retrieved January 8, 2021.
"What are the chances of an AI apocalypse?". The Economist. July 10, 2023. Retrieved July 10, 2023.
"A 30% Chance of AI Catastrophe: Samotsvety's Forecasts on AI Risks and the Impact of a Strong AI Treaty". Treaty on Artificial Intelligence Safety and Cooperation (TAISC). May 1, 2023. Retrieved May 1, 2023.
"Will humans become extinct by 2100?". Metaculus. November 12, 2017. Retrieved November 19, 2023.
Edwards, Lin (June 23, 2010). "Humans will be extinct in 100 years says eminent scientist". Phys.org. Retrieved January 10, 2021.
Nafeez, Ahmed (July 28, 2020). "Theoretical Physicists Say 90% Chance of Societal Collapse Within Several Decades". Vice. Retrieved August 2, 2021.
Bologna, M.; Aquino, G. (2020). "Deforestation and world population sustainability: a quantitative analysis". Scientific Reports. 10: 7631. arXiv:2006.12202. Bibcode:2020NatSR..10.7631B. doi:10.1038/s41598-020-63657-6. PMC 7203172. PMID 32376879.
Bostrom, Nick (2002). "Existential Risks: Analyzing Human Extinction Scenarios and
Related Hazards". Journal of Evolution and Technology. vol. 9. "My subjective opinion is that setting this probability lower than 25% would be misguided, and the best estimate may be considerably higher."
Whitmire, Daniel P. (August 3, 2017). "Implication of our technological species being first and early". International Journal of Astrobiology. 18 (2): 183–188. doi:10.1017/S1473550417000271.
Leslie 1996, p. 139.
Salotti, Jean-Marc (April 2022). "Human extinction by asteroid impact". Futures. 138: 102933. doi:10.1016/j.futures.2022.102933. S2CID 247718308.
"Practical application", page 39 of the Princeton University paper "Philosophical Implications of Inflationary Cosmology". Archived May 12, 2005, at the Wayback Machine.
Wells, Willard (2009). Apocalypse When?. Praxis. ISBN 978-0387098364.
Tonn, Bruce; MacGregor, Donald (2009). "A singular chain of events". Futures. 41 (10): 706–714. doi:10.1016/j.futures.2009.07.009. S2CID 144553194. SSRN 1775342.
Parfit, Derek (1984). Reasons and Persons. Oxford University Press. pp. 453–454.
MacAskill, William; Yetter Chappell, Richard (2021). "Population Ethics: Practical Implications of Population Ethical Theories". Introduction to Utilitarianism. Retrieved August 12, 2021.
Bostrom, Nick (2009). "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas. 15 (3): 308–314. CiteSeerX 10.1.1.429.2849. doi:10.1017/s0953820800004076. S2CID 15860897.
Todd, Benjamin (2017). "The case for reducing existential risks". 80,000 Hours. Retrieved January 8, 2020.
Sagan, Carl (1983). "Nuclear war and climatic catastrophe: Some policy implications". Foreign Affairs. 62 (2): 257–292. doi:10.2307/20041818. JSTOR 20041818. S2CID 151058846.
Adams, Robert Merrihew (October 1989). "Should Ethics be More Impersonal? A Critical Notice of Derek Parfit, Reasons and Persons". The Philosophical Review. 98 (4): 439–484. doi:10.2307/2185115. JSTOR 2185115.
Bostrom 2013, pp. 23–24.
Parfit, D. (1984). Reasons and Persons. Oxford: Clarendon Press. pp. 453–454.
Narveson, Jan (1973). "Moral Problems of Population". The Monist. 57 (1): 62–86. doi:10.5840/monist197357134. PMID 11661014.
Greaves,
Hilary (2017). "Discounting for Public Policy: A Survey". Economics & Philosophy. 33 (3): 391–439. doi:10.1017/S0266267117000062. ISSN 0266-2671. S2CID 21730172.
Greaves, Hilary (2017). "Population axiology". Philosophy Compass. 12 (11): e12442. doi:10.1111/phc3.12442. ISSN 1747-9991.
Lewis, Gregory (May 23, 2018). "The person-affecting value of existential risk reduction". www.gregoryjlewis.com. Retrieved August 7, 2020.
Sagan, Carl (Winter 1983). "Nuclear War and Climatic Catastrophe: Some Policy Implications". Foreign Affairs. Council on Foreign Relations. doi:10.2307/20041818. JSTOR 20041818. Retrieved August 4, 2020.
Burke, Edmund (1999) [1790]. "Reflections on the Revolution in France" (PDF). In Canavan, Francis (ed.). Select Works of Edmund Burke, Volume 2. Liberty Fund. p. 192.
Weitzman, Martin (2009). "On modeling and interpreting the economics of catastrophic climate change" (PDF). The Review of Economics and Statistics. 91 (1): 1–19. doi:10.1162/rest.91.1.1. S2CID 216093786.
Posner, Richard (2004). Catastrophe: Risk and Response. Oxford University Press.
Benatar, David (2008). Better Never to Have Been: The Harm of Coming into Existence. Oxford University Press. p. 28. ISBN 978-0199549269. "Being brought into existence is not a benefit but always a harm."
Benatar, David (2008). Better Never to Have Been: The Harm of Coming into Existence. Oxford University Press. p. 224. ISBN 978-0199549269. "Although there are many non-human species, especially carnivores, that also cause a lot of suffering, humans have the unfortunate distinction of being the most destructive and harmful species on earth. The amount of suffering in the world could be radically reduced if there were no more humans."
Best, Steven (2014). "Conclusion: Reflections on Activism and Hope in a Dying World and Suicidal Culture". The Politics of Total Liberation: Revolution for the 21st Century. Palgrave Macmillan. p. 165. doi:10.1057/9781137440723_7. ISBN 978-1137471116. "In an era of catastrophe and crisis, the continuation of the human species in a viable or desirable form is obviously contingent and not a given or necessary good.
But considered from the standpoint of animals and the earth, the demise of humanity would be the best imaginable event possible, and the sooner the better. The extinction of Homo sapiens would remove the malignancy ravaging the planet, destroy a parasite consuming its host, shut down the killing machines, and allow the earth to regenerate while permitting new species to evolve."
May, Todd (December 17, 2018). "Would Human Extinction Be a Tragedy?". The New York Times. "Human beings are destroying large parts of the inhabitable earth and causing unimaginable suffering to many of the animals that inhabit it. This is happening through at least three means. First, human contribution to climate change is devastating ecosystems. Second, the increasing human population is encroaching on ecosystems that would otherwise be intact. Third, factory farming fosters the creation of millions upon millions of animals for whom it offers nothing but suffering and misery before slaughtering them in often barbaric ways. There is no reason to think that those practices are going to diminish any time soon. Quite the opposite."
MacCormack, Patricia (2020). The Ahuman Manifesto: Activism for the End of the Anthropocene. Bloomsbury Academic. pp. 143–166. ISBN 978-1350081093.
Wagar, W. Warren (2003). "Review of The Last Man, Jean-Baptiste Francois Xavier Cousin de Grainville". Utopian Studies. 14 (1): 178–180. ISSN 1045-991X. JSTOR 20718566.
"He imagines a world without people. But why?". The Boston Globe. August 18, 2007. Retrieved July 20, 2016.
Tucker, Neely (March 8, 2008). "Depopulation Boom". The Washington Post. Retrieved July 20, 2016.
Barcella, Laura (2012). The end: 50 apocalyptic visions from pop culture that you should know about before it's too late. San Francisco, CA: Zest Books. ISBN 978-0982732250.
Dinello, Daniel (2005). Technophobia: science fiction visions of posthuman technology (1st ed.). Austin: University of Texas Press. ISBN 978-0-292-70986-7.

Sources

Bostrom, Nick (2002). "Existential risks: analyzing human extinction scenarios and related hazards". Journal of
Evolution and Technology. 9. ISSN 1541-0099.
Bostrom, Nick; Cirkovic, Milan M. (September 29, 2011) [orig. July 3, 2008]. "1: Introduction". In Bostrom, Nick; Cirkovic, Milan M. (eds.). Global Catastrophic Risks. Oxford University Press. pp. 1–30. ISBN 9780199606504. OCLC 740989645.
Rampino, Michael R. "10: Super-volcanism and other geophysical processes of catastrophic import". In Bostrom & Cirkovic (2011), pp. 205–221.
Napier, William. "11: Hazards from comets and asteroids". In Bostrom & Cirkovic (2011), pp. 222–237.
Dar, Arnon. "12: Influence of supernovae, gamma-ray bursts, solar flares, and cosmic rays on the terrestrial environment". In Bostrom & Cirkovic (2011), pp. 238–262.
Frame, David; Allen, Myles R. "13: Climate change and global risk". In Bostrom & Cirkovic (2011), pp. 265–286.
Kilbourne, Edwin Dennis. "14: Plagues and pandemics: past, present, and future". In Bostrom & Cirkovic (2011), pp. 287–304.
Yudkowsky, Eliezer. "15: Artificial Intelligence as a positive and negative factor in global risk". In Bostrom & Cirkovic (2011), pp. 308–345.
Wilczek, Frank. "16: Big troubles, imagined and real". In Bostrom & Cirkovic (2011), pp. 346–362.
Cirincione, Joseph. "18: The continuing threat of nuclear war". In Bostrom & Cirkovic (2011), pp. 381–401.
Ackerman, Gary; Potter, William C. "19: Catastrophic nuclear terrorism: a preventable peril". In Bostrom & Cirkovic (2011), pp. 402–449.
Nouri, Ali; Chyba, Christopher F. "20: Biotechnology and biosecurity". In Bostrom & Cirkovic (2011), pp. 450–480.
Phoenix, Chris; Treder, Mike. "21: Nanotechnology as global catastrophic risk". In Bostrom & Cirkovic (2011), pp. 481–503.
Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority". Global Policy. 4 (1): 15–31. doi:10.1111/1758-5899.12002. ISSN 1758-5899. PDF.
Leslie, John (1996). The End of the World: The Science and Ethics of Human Extinction. Routledge. ISBN 978-0415140430. OCLC 1158823437.
Posner, Richard A. (November 11, 2004). Catastrophe: Risk and Response. Oxford University Press. ISBN 978-0-19-534639-8. OCLC 224729961.
Rees, Martin J. (March 19, 2003). Our Final Hour: A Scientist's Warning: how Terror, Error and Environmental Disaster
Threaten Humankind's Future in this Century, on Earth and Beyond. Basic Books. ISBN 978-0-465-06862-3. OCLC 51315429.

Further reading

Boulter, Michael (2005). Extinction: Evolution and the End of Man. Columbia University Press. ISBN 978-0231128377.
de Bellaigue, Christopher. "A World Off the Hinges" (review of Peter Frankopan, The Earth Transformed: An Untold History, Knopf, 2023, 695 pp.), The New York Review of Books, vol. LXX, no. 18 (23 November 2023), pp. 40–42. De Bellaigue writes: "Like the Maya and the Akkadians, we have learned that a broken environment aggravates political and economic dysfunction, and that the inverse is also true. Like the Qing, we rue the deterioration of our soils. But the lesson is never learned. Denialism is one of the most fundamental of human traits and helps explain our current inability to come up with a response commensurate with the perils we face." (p. 41)
Holt, Jim. "The Power of Catastrophic Thinking" (review of Toby Ord, The Precipice: Existential Risk and the Future of Humanity, Hachette, 2020, 468 pp.), The New York Review of Books, vol. LXVIII, no. 3 (February 25, 2021), pp. 26–29. Jim Holt writes (p. 28): "Whether you are searching for a cure for cancer, or pursuing a scholarly or artistic career, or engaged in establishing more just institutions, a threat to the future of humanity is also a threat to the significance of what you do."
MacCormack, Patricia (2020). "Embracing Death, Opening the World". Australian Feminist Studies. 35 (104): 101–115. doi:10.1080/08164649.2020.1791689. S2CID 221790005.
Moyer, Michael (September 2010). "Eternal Fascinations with the End: Why We're Suckers for Stories of Our Own Demise. Our pattern-seeking brains and desire to be special help explain our fears of the apocalypse". Scientific American.
Schubert, Stefan; Caviola, Lucius; Faber, Nadira S. (2019). "The Psychology of Existential Risk: Moral Judgments about Human Extinction". Scientific Reports. 9 (1): 15100. Bibcode:2019NatSR...915100S. doi:10.1038/s41598-019-50145-9. PMC 6803761. PMID 31636277.
Ord, Toby (2020). The Precipice: Existential Risk and the Future
of Humanity. Bloomsbury Publishing. ISBN 1526600218.
Torres, Phil (2017). Morality, Foresight, and Human Flourishing: An Introduction to Existential Risks. Pitchstone Publishing. ISBN 978-1634311427.
Weber, Michel. "Book Review: Walking Away from Empire". Cosmos and History: The Journal of Natural and Social Philosophy, vol. 10, no. 2, 2014, pp. 329–336.
"What would happen to Earth if humans went extinct?". Live Science. August 16, 2020.
"A.I. poses human extinction risk on par with nuclear war, Sam Altman and other tech leaders warn". CNBC. May 31, 2023.
"Treading Thin Air": Geoff Mann on uncertainty and climate change, London Review of Books, vol. 45, no. 17 (7 September 2023), pp. 17–19. "[W]e are in desperate need of a politics that looks the catastrophic uncertainty of global warming and climate change square in the face. That would mean taking much bigger and more transformative steps: all but eliminating fossil fuels and prioritizing democratic institutions over markets. The burden of this effort must fall almost entirely on the richest people and richest parts of the world, because it is they who continue to gamble with everyone else's fate." (p. 19)
