Global catastrophic risk

A global catastrophic risk or a doomsday scenario is a hypothetical future event that could damage human well-being on a global scale,[2] even endangering or destroying modern civilization.[3] An event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an "existential risk."[4]

Artist's impression of a major asteroid impact. An asteroid caused the extinction of the non-avian dinosaurs.[1]

Over the last two decades, a number of academic and non-profit organizations have been established to research global catastrophic and existential risks, formulate potential mitigation measures and either advocate for or implement these measures.[5][6][7][8]

Definition and classification

Scope/intensity grid from Bostrom's paper "Existential Risk Prevention as Global Priority"[9]

Defining global catastrophic risks

The term global catastrophic risk "lacks a sharp definition", and generally refers (loosely) to a risk that could inflict "serious damage to human well-being on a global scale".[10]

Humanity has suffered large catastrophes before. Some of these caused serious damage but were only local in scope—e.g. the Black Death may have resulted in the deaths of a third of Europe's population,[11] 10% of the global population at the time.[12] Some were global, but not as severe—e.g. the 1918 influenza pandemic killed an estimated 3–6% of the world's population.[13] Most global catastrophic risks would not be so intense as to kill the majority of life on Earth, but even if one did, the ecosystem and humanity would eventually recover (in contrast to existential risks).

Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional", scale. Posner highlights such events as worthy of special attention on cost–benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole.[14]

Defining existential risks

Existential risks are defined as "risks that threaten the destruction of humanity's long-term potential."[15] The instantiation of an existential risk (an existential catastrophe[16]) would either cause outright human extinction or irreversibly lock in a drastically inferior state of affairs.[9][17] Existential risks are a sub-class of global catastrophic risks, where the damage is not only global but also terminal and permanent, preventing recovery and thereby affecting both current and all future generations.[9]

Non-extinction risks

While extinction is the most obvious way in which humanity's long-term potential could be destroyed, there are others, including unrecoverable collapse and unrecoverable dystopia.[18] A disaster severe enough to cause the permanent, irreversible collapse of human civilisation would constitute an existential catastrophe, even if it fell short of extinction.[18] Similarly, if humanity fell under a totalitarian regime with no chance of recovery, such a dystopia would also be an existential catastrophe.[19] Bryan Caplan writes that "perhaps an eternity of totalitarianism would be worse than extinction".[19] (George Orwell's novel Nineteen Eighty-Four suggests an example.[20][21]) A dystopian scenario shares the key features of extinction and unrecoverable collapse of civilisation: before the catastrophe, humanity faced a vast range of bright futures to choose from; after it, humanity is locked forever in a terrible state.[18]

Potential sources of risk

Potential global catastrophic risks are conventionally classified as anthropogenic or non-anthropogenic hazards. Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the Sun's predicted transformation into a red giant star engulfing the Earth.

Anthropogenic risks are those caused by humans and include those related to technology, governance, and climate change. Technological risks include the creation of artificial intelligence misaligned with human goals, biotechnology, and nanotechnology. Insufficient or malign global governance creates risks in the social and political domain, such as global war and nuclear holocaust, bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructure like the electrical grid, or the failure to manage a natural or engineered pandemic. Global catastrophic risks in the domain of earth system governance include global warming, environmental degradation, extinction of species, famine as a result of non-equitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture.

Methodological challenges

Research into the nature and mitigation of global catastrophic risks and existential risks is subject to a unique set of challenges and, as a result, cannot easily be held to the usual standards of scientific rigour.[18] For instance, it is neither feasible nor ethical to study these risks experimentally. Carl Sagan expressed this with regard to nuclear war: "Understanding the long-term consequences of nuclear war is not a problem amenable to experimental verification".[22] Moreover, many catastrophic risks change rapidly as technology advances and background conditions, such as geopolitical conditions, shift. Another challenge is the general difficulty of accurately predicting the future over long timescales, especially for anthropogenic risks, which depend on complex human political, economic and social systems.[18] In addition to known and tangible risks, unforeseeable black swan extinction events may occur, presenting an additional methodological problem.[18][23]

Lack of historical precedent

Humanity has never suffered an existential catastrophe, and if one were to occur it would necessarily be unprecedented.[18] Existential risks therefore pose unique challenges to prediction, even more than other long-term events, because of observation selection effects.[24] Unlike with most events, the failure of complete extinction events to occur in the past is not evidence against their likelihood in the future: every world that has experienced such an event has no observers, so regardless of their frequency, no civilization observes existential risks in its history.[24] These anthropic issues may partly be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.[9]
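This observation selection effect lends itself to a simple simulation. Below is a minimal Monte Carlo sketch in Python (illustrative only; the per-century extinction probabilities are made up) showing why the historical record looks identical to surviving observers regardless of how dangerous their world actually is:

```python
import random

def observed_record(p_extinction_per_century, centuries=100):
    """Simulate one world. Return the number of extinction events an
    observer in that world can see in its history (always 0), or None
    if the world went extinct and therefore contains no observers."""
    for _ in range(centuries):
        if random.random() < p_extinction_per_century:
            return None  # extinct worlds have no one left to look back
    return 0  # every surviving world reports a spotless record

def fraction_surviving(p, trials=20_000):
    """Fraction of simulated worlds that still contain observers."""
    return sum(observed_record(p) == 0 for _ in range(trials)) / trials

random.seed(0)
for p in (1e-5, 1e-3, 1e-2):
    print(f"per-century risk {p:g}: {fraction_surviving(p):.3f} of worlds "
          f"survive; every surviving observer sees 0 past extinctions")
```

Whatever the true risk, every observer who exists to consult the record finds zero past extinctions; only the fraction of worlds that still contain observers varies, and that fraction cannot be measured from inside any single world.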

To understand the dynamics of an unprecedented, unrecoverable global civilizational collapse (a type of existential risk), it may be instructive to study the various local civilizational collapses that have occurred throughout human history.[25] For instance, civilizations such as the Roman Empire have ended in a loss of centralized governance and a major civilization-wide loss of infrastructure and advanced technology. However, such examples suggest that societies can be fairly resilient to catastrophe; for example, Medieval Europe survived the Black Death without suffering anything resembling civilizational collapse, despite losing 25 to 50 percent of its population.[26]

Incentives and coordination

Economic considerations help explain why so little effort goes into existential risk reduction. Such reduction is a global public good, so we should expect it to be undersupplied by markets.[9] Even if a large nation invests in risk mitigation measures, that nation will enjoy only a small fraction of the benefit of doing so. Furthermore, existential risk reduction is an intergenerational global public good: most of its benefits would be enjoyed by future generations, and though these future people would perhaps be willing to pay substantial sums for it, no mechanism for such a transaction exists.[9]
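The free-rider problem here can be made concrete with a toy calculation (hypothetical numbers throughout; this is a sketch of the incentive structure, not an estimate of real costs or benefits):

```python
# Why a global public good is undersupplied: one nation pays the full
# cost of a mitigation measure but captures only its share of the benefit.
global_benefit = 1_000.0  # total value of the risk reduction (arbitrary units)
cost = 50.0               # cost of the mitigation measure (arbitrary units)
national_share = 0.04     # fraction of the benefit a single nation captures

national_benefit = national_share * global_benefit
print(f"Worth it for the world?  {global_benefit} > {cost}: {global_benefit > cost}")
print(f"Worth it for one nation? {national_benefit} > {cost}: {national_benefit > cost}")
# The measure passes a global cost-benefit test (1000 > 50) but fails a
# national one (40 < 50), so no nation acting alone funds it. The problem
# is compounded when most beneficiaries are future generations, who
# cannot pay at all.
```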

Cognitive biases

Numerous cognitive biases can influence people's judgment of the importance of existential risks, including scope insensitivity, hyperbolic discounting, availability heuristic, the conjunction fallacy, the affect heuristic, and the overconfidence effect.[27]

Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as willing to prevent the deaths of 200,000 or 2,000 birds.[28] Similarly, people are often more concerned about threats to individuals than to larger groups.[27]

Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:[29][30]

Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".

All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the absence of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.[31]

Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."[32]

Proposed mitigation

Multi-layer defense

Defense in depth is a useful framework for categorizing risk mitigation measures into three layers of defense:[33]

  1. Prevention: Reducing the probability of a catastrophe occurring in the first place. Example: Measures to prevent outbreaks of new highly infectious diseases.
  2. Response: Preventing the scaling of a catastrophe to the global level. Example: Measures to prevent escalation of a small-scale nuclear exchange into an all-out nuclear war.
  3. Resilience: Increasing humanity's resilience (against extinction) when faced with global catastrophes. Example: Measures to increase food security during a nuclear winter.

Human extinction is most likely when all three defenses are weak, that is, "by risks we are unlikely to prevent, unlikely to successfully respond to, and unlikely to be resilient against".[33]
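On one natural reading of this framework, extinction requires all three layers to fail in succession, so the overall risk can be sketched as a product of per-layer failure probabilities. The sketch below uses hypothetical numbers, and the independence of the layers is a simplifying assumption:

```python
# Three-layer "defence in depth" sketch with hypothetical probabilities.
p_prevention_fails = 0.10  # a catastrophe occurs at all
p_response_fails = 0.20    # it escalates to the global level
p_resilience_fails = 0.05  # humanity does not survive it

p_extinction = p_prevention_fails * p_response_fails * p_resilience_fails
print(f"P(extinction) = {p_extinction:.4f}")  # 0.0010

# Halving the failure probability of any one layer halves the total risk,
# so strength in any single layer partly compensates for weakness in the
# others; the worst case is weakness in all three at once.
for name, (a, b, c) in [("prevention", (0.05, 0.20, 0.05)),
                        ("response", (0.10, 0.10, 0.05)),
                        ("resilience", (0.10, 0.20, 0.025))]:
    print(f"halving {name:>10} failure: P = {a * b * c:.5f}")
```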

The unprecedented nature of existential risks poses a special challenge in designing risk mitigation measures since humanity will not be able to learn from a track record of previous events.[18]

Funding

Some researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on Star Trek, snowboarding, or dung beetles than on existential risks. Bostrom's comparisons have been criticized as "high-handed".[34][35] As of 2020, the Biological Weapons Convention organization had an annual budget of US$1.4 million.[36]

Survival planning

Some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving a global disaster.[37][38][39] Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.[37][40]

Global food storage has been proposed, but the monetary cost would be high. Furthermore, it would likely exacerbate the millions of deaths per year already caused by malnutrition.[41] In 2022, a team led by David Denkenberger modeled the cost-effectiveness of resilient foods against artificial general intelligence (AGI) safety and found "∼98-99% confidence" that work on resilient foods has the higher marginal impact.[42] Some survivalists stock survival retreats with multiple-year food supplies.

The Svalbard Global Seed Vault is buried 400 feet (120 m) inside a mountain on an island in the Arctic. It is designed to hold 2.5 billion seeds from more than 100 countries as a precaution to preserve the world's crops. The surrounding rock is −6 °C (21 °F) (as of 2015) but the vault is kept at −18 °C (0 °F) by refrigerators powered by locally sourced coal.[43][44]

More speculatively, if society continues to function and if the biosphere remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning. Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.[45][46]
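A back-of-envelope scale check (using assumed round numbers, not figures from the cited studies) conveys what any such sunlight-independent scheme would have to deliver:

```python
# Rough global food requirement during an extended absence of sunlight.
# All inputs are assumptions chosen for round numbers.
population = 8.0e9           # people
kcal_per_person_day = 2100   # approximate maintenance requirement
kcal_per_kg_dry_food = 4000  # typical energy density of dry staples

kcal_per_day = population * kcal_per_person_day
tonnes_per_day = kcal_per_day / kcal_per_kg_dry_food / 1000  # kg -> tonnes

print(f"Global requirement: {kcal_per_day:.2e} kcal/day")
print(f"= {tonnes_per_day / 1e6:.1f} million tonnes of dry food per day")
# About 1.7e13 kcal/day, i.e. roughly 4 million tonnes of dry food daily --
# the production scale that mushrooms, cellulose-derived sugar, or
# methane-digesting bacteria would need to reach.
```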

Global catastrophic risks and global governance

Insufficient global governance creates risks in the social and political domain, but governance mechanisms develop more slowly than technological and social change. Governments, the private sector, and the general public have expressed concern about the lack of governance mechanisms for efficiently dealing with risks and for negotiating and adjudicating between diverse and conflicting interests. This concern is underlined by an understanding of the interconnectedness of global systemic risks.[47] In the absence or anticipation of global governance, national governments can act individually to better understand, mitigate and prepare for global catastrophes.[48]

Climate emergency plans

In 2018, the Club of Rome called for greater climate change action and published its Climate Emergency Plan, which proposes ten action points to limit global average temperature increase to 1.5 degrees Celsius.[49] Further, in 2019, the Club published the more comprehensive Planetary Emergency Plan.[50]

There is evidence to suggest that collectively engaging with the emotional experiences that emerge when contemplating the vulnerability of the human species within the context of climate change allows these experiences to be adaptive. When collective engagement with and processing of these emotional experiences is supported, it can lead to growth in resilience, psychological flexibility, tolerance of emotional experiences, and community engagement.[51]

Space colonization

Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario.[52] Solutions of this scope may require megascale engineering.

Astrophysicist Stephen Hawking advocated colonizing other planets within the solar system once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war.[53][54]

Billionaire Elon Musk writes that humanity must become a multiplanetary species in order to avoid extinction.[55] Musk is using his company SpaceX to develop technology he hopes will be used in the colonization of Mars.

Moving the Earth

In a few billion years, the Sun will expand into a red giant, swallowing the Earth. This could be avoided by moving the Earth farther from the Sun, keeping the temperature roughly constant. That could be accomplished by tweaking the orbits of comets and asteroids so that they pass close to the Earth in such a way that they add energy to the Earth's orbit.[56] Since the Sun's expansion is slow, roughly one such encounter every 6,000 years would suffice.[citation needed]
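An order-of-magnitude sketch of the energy budget, using the standard expression E = -G M_sun m / (2a) for the orbital energy of a circular orbit, is consistent with the cited proposal's reliance on massive bodies and many repeated encounters (the 1.5 AU target orbit and the ~5-billion-year timescale below are assumptions for illustration):

```python
# Energy needed to migrate Earth outward, and the per-flyby share if one
# gravity-assist encounter occurs every 6,000 years. Target orbit and
# total timescale are illustrative assumptions.
GM_SUN = 1.327e20  # gravitational parameter of the Sun, m^3/s^2
M_EARTH = 5.97e24  # kg
AU = 1.496e11      # m

def orbital_energy(a):
    """Total energy of a body of Earth's mass in a circular orbit of radius a."""
    return -GM_SUN * M_EARTH / (2.0 * a)

delta_e = orbital_energy(1.5 * AU) - orbital_energy(1.0 * AU)
encounters = 5.0e9 / 6000  # one encounter every 6,000 years for ~5 Gyr

print(f"Total energy to raise the orbit: {delta_e:.2e} J")
print(f"Energy per encounter over {encounters:.0f} flybys: {delta_e / encounters:.2e} J")
# Roughly 9e32 J in total and ~1e27 J per flyby -- broadly comparable to
# the orbital kinetic energy of a ~100 km asteroid, hence the proposal's
# use of large bodies on repeated close passes.
```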

Skeptics and opponents

Psychologist Steven Pinker has called existential risk a "useless category" that can distract from real threats such as climate change and nuclear war.[34]

Organizations

The Bulletin of the Atomic Scientists (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of WWII. It studies risks associated with nuclear war and energy and famously maintains the Doomsday Clock established in 1947. The Foresight Institute (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by K. Eric Drexler who postulated "grey goo".[57][58]

Since 2000, a growing number of scientists, philosophers and tech billionaires have created organizations devoted to studying global risks both inside and outside of academia.[59]

Independent non-governmental organizations (NGOs) include the Machine Intelligence Research Institute (est. 2000), which aims to reduce the risk of a catastrophe caused by artificial intelligence,[60] with donors including Peter Thiel and Jed McCaleb.[61] The Nuclear Threat Initiative (est. 2001) seeks to reduce global threats from nuclear, biological and chemical weapons, and to contain damage after an event.[8] It maintains a nuclear material security index.[62] The Lifeboat Foundation (est. 2009) funds research into preventing a technological catastrophe;[63] most of its research money funds projects at universities.[64] The Global Catastrophic Risk Institute (est. 2011) is a US-based non-profit, non-partisan think tank founded by Seth Baum and Tony Barrett that does research and policy work across various risks, including artificial intelligence, nuclear war, climate change, and asteroid impacts.[65] The Global Challenges Foundation (est. 2012), based in Stockholm and founded by Laszlo Szombatfalvy, releases a yearly report on the state of global risks.[66][67] The Future of Life Institute (est. 2014) works to reduce extreme, large-scale risks from transformative technologies and to steer the development and use of these technologies to benefit all life, through grantmaking, policy advocacy in the United States, European Union and United Nations, and educational outreach;[7] Elon Musk, Vitalik Buterin and Jaan Tallinn are among its biggest donors.[68] The Center on Long-Term Risk (est. 2016), formerly known as the Foundational Research Institute, is a British organization focused on reducing risks of astronomical suffering (s-risks) from emerging technologies.[69]

University-based organizations include the Future of Humanity Institute (est. 2005), which researches questions about humanity's long-term future, particularly existential risk.[5] It was founded by Nick Bostrom and is based at Oxford University.[5] The Centre for the Study of Existential Risk (est. 2012) is a Cambridge University-based organization which studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare.[6] All are man-made risks, as Huw Price explained to the AFP news agency: "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology." He added that when this happens, "we're no longer the smartest things around" and will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us".[70] Stephen Hawking was an acting adviser. The Millennium Alliance for Humanity and the Biosphere is a Stanford University-based organization that focuses on many issues related to global catastrophe by bringing together members of academia in the humanities.[71][72] It was founded by Paul Ehrlich, among others.[73] Stanford University also has the Center for International Security and Cooperation, which focuses on political cooperation to reduce global catastrophic risk.[74] The Center for Security and Emerging Technology was established in January 2019 at Georgetown's Walsh School of Foreign Service and focuses on policy research into emerging technologies, with an initial emphasis on artificial intelligence.[75] It received a grant of US$55 million from Good Ventures, as suggested by Open Philanthropy.[75]

Other risk assessment groups are based in or are part of governmental organizations. The World Health Organization (WHO) includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises.[76] GAR helps member states with training and coordination of responses to epidemics.[77] The United States Agency for International Development (USAID) has an Emerging Pandemic Threats Program which aims to prevent and contain naturally generated pandemics at their source.[78] The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate which researches issues such as bio-security and counter-terrorism on behalf of the government.[79]

References

  1. ^ Schulte, P.; et al. (March 5, 2010). "The Chicxulub Asteroid Impact and Mass Extinction at the Cretaceous-Paleogene Boundary" (PDF). Science. 327 (5970): 1214–1218. Bibcode:2010Sci...327.1214S. doi:10.1126/science.1177265. PMID 20203042. S2CID 2659741.
  2. ^ Bostrom, Nick (2008). Global Catastrophic Risks (PDF). Oxford University Press. p. 1.
  3. ^ Ripple WJ, Wolf C, Newsome TM, Galetti M, Alamgir M, Crist E, Mahmoud MI, Laurance WF (November 13, 2017). "World Scientists' Warning to Humanity: A Second Notice". BioScience. 67 (12): 1026–1028. doi:10.1093/biosci/bix125.
  4. ^ Bostrom, Nick (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology. 9.
  5. ^ a b c "About FHI". Future of Humanity Institute. Retrieved August 12, 2021.
  6. ^ a b "About us". Centre for the Study of Existential Risk. Retrieved August 12, 2021.
  7. ^ a b "The Future of Life Institute". Future of Life Institute. Retrieved May 5, 2014.
  8. ^ a b "Nuclear Threat Initiative". Nuclear Threat Initiative. Retrieved June 5, 2015.
  9. ^ a b c d e f Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority" (PDF). Global Policy. 4 (1): 15–31. doi:10.1111/1758-5899.12002 – via Existential Risk.
  10. ^ Bostrom, Nick; Cirkovic, Milan (2008). Global Catastrophic Risks. Oxford: Oxford University Press. p. 1. ISBN 978-0-19-857050-9.
  11. ^ Ziegler, Philip (2012). The Black Death. Faber and Faber. p. 397. ISBN 9780571287116.
  12. ^ Muehlhauser, Luke (March 15, 2017). "How big a deal was the Industrial Revolution?". lukemuelhauser.com. Retrieved August 3, 2020.
  13. ^ Taubenberger, Jeffery; Morens, David (2006). "1918 Influenza: the Mother of All Pandemics". Emerging Infectious Diseases. 12 (1): 15–22. doi:10.3201/eid1201.050979. PMC 3291398. PMID 16494711.
  14. ^ Posner, Richard A. (2006). Catastrophe: Risk and Response. Oxford: Oxford University Press. ISBN 978-0195306477. Introduction, "What is Catastrophe?"
  15. ^ Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916. This is an equivalent, though crisper statement of Nick Bostrom's definition: "An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development." Source: Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority". Global Policy. 4:15-31.
  16. ^ Cotton-Barratt, Owen; Ord, Toby (2015), Existential risk and existential hope: Definitions (PDF), Future of Humanity Institute – Technical Report #2015-1, pp. 1–4
  17. ^ Bostrom, Nick (2009). "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas. 15 (3): 308–314. CiteSeerX 10.1.1.429.2849. doi:10.1017/s0953820800004076. S2CID 15860897.
  18. ^ a b c d e f g h Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916.
  19. ^ a b Bryan Caplan (2008). "The totalitarian threat". Global Catastrophic Risks, eds. Bostrom & Cirkovic (Oxford University Press): 504–519. ISBN 9780198570509
  20. ^ Glover, Dennis (June 1, 2017). "Did George Orwell secretly rewrite the end of Nineteen Eighty-Four as he lay dying?". The Sydney Morning Herald. Retrieved November 21, 2021. Winston's creator, George Orwell, believed that freedom would eventually defeat the truth-twisting totalitarianism portrayed in Nineteen Eighty-Four.
  21. ^ Orwell, George (1949). Nineteen Eighty-Four. A novel. London: Secker & Warburg.
  22. ^ Sagan, Carl (Winter 1983). "Nuclear War and Climatic Catastrophe: Some Policy Implications". Foreign Affairs. Council on Foreign Relations. doi:10.2307/20041818. JSTOR 20041818. Retrieved August 4, 2020.
  23. ^ Jebari, Karim (2014). "Existential Risks: Exploring a Robust Risk Reduction Strategy" (PDF). Science and Engineering Ethics. 21 (3): 541–54. doi:10.1007/s11948-014-9559-3. PMID 24891130. S2CID 30387504. Retrieved August 26, 2018.
  24. ^ a b Cirkovic, Milan M.; Bostrom, Nick; Sandberg, Anders (2010). "Anthropic Shadow: Observation Selection Effects and Human Extinction Risks" (PDF). Risk Analysis. 30 (10): 1495–1506. doi:10.1111/j.1539-6924.2010.01460.x. PMID 20626690. S2CID 6485564.
  25. ^ Kemp, Luke (February 2019). "Are we on the road to civilization collapse?". BBC. Retrieved August 12, 2021.
  26. ^ Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. ISBN 9780316484893. Europe survived losing 25 to 50 percent of its population in the Black Death, while keeping civilization firmly intact
  27. ^ a b Yudkowsky, Eliezer (2008). "Cognitive Biases Potentially Affecting Judgment of Global Risks" (PDF). Global Catastrophic Risks: 91–119. Bibcode:2008gcr..book...86Y.
  28. ^ Desvousges, W.H., Johnson, F.R., Dunford, R.W., Boyle, K.J., Hudson, S.P., and Wilson, N. 1993, Measuring natural resource damages with contingent valuation: tests of validity and reliability. In Hausman, J.A. (ed), Contingent Valuation:A Critical Assessment, pp. 91–159 (Amsterdam: North Holland).
  29. ^ Bostrom 2013.
  30. ^ Yudkowsky, Eliezer. "Cognitive biases potentially affecting judgment of global risks". Global catastrophic risks 1 (2008): 86. p.114
  31. ^ "We're Underestimating the Risk of Human Extinction". The Atlantic. March 6, 2012. Retrieved July 1, 2016.
  32. ^ Is Humanity Suicidal? The New York Times Magazine May 30, 1993)
  33. ^ a b Cotton-Barratt, Owen; Daniel, Max; Sandberg, Anders (2020). "Defence in Depth Against Human Extinction: Prevention, Response, Resilience, and Why They All Matter". Global Policy. 11 (3): 271–282. doi:10.1111/1758-5899.12786. ISSN 1758-5899. PMC 7228299. PMID 32427180.
  34. ^ a b Kupferschmidt, Kai (January 11, 2018). "Could science destroy the world? These scholars want to save us from a modern-day Frankenstein". Science. AAAS. Retrieved April 20, 2020.
  35. ^ "Oxford Institute Forecasts The Possible Doom Of Humanity". Popular Science. 2013. Retrieved April 20, 2020.
  36. ^ Toby Ord (2020). The precipice: Existential risk and the future of humanity. ISBN 9780316484893. The international body responsible for the continued prohibition of bioweapons (the Biological Weapons Convention) has an annual budget of $1.4 million - less than the average McDonald's restaurant
  37. ^ a b Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction" (PDF). Risk Analysis. 27 (5): 1335–1344. doi:10.1111/j.1539-6924.2007.00960.x. PMID 18076500. S2CID 14265396.
  38. ^ Wells, Willard. (2009). Apocalypse when?. Praxis. ISBN 978-0387098364.
  39. ^ Wells, Willard. (2017). Prospects for Human Survival. Lifeboat Foundation. ISBN 978-0998413105.
  40. ^ Hanson, Robin. "Catastrophe, social collapse, and human extinction". Global catastrophic risks 1 (2008): 357.
  41. ^ Smil, Vaclav (2003). The Earth's Biosphere: Evolution, Dynamics, and Change. MIT Press. p. 25. ISBN 978-0-262-69298-4.
  42. ^ Denkenberger, David C.; Sandberg, Anders; Tieman, Ross John; Pearce, Joshua M. (2022). "Long term cost-effectiveness of resilient foods for global catastrophes compared to artificial general intelligence safety". International Journal of Disaster Risk Reduction. 73: 102798. doi:10.1016/j.ijdrr.2022.102798.
  43. ^ Lewis Smith (February 27, 2008). The Times Online. London. Archived from the original on May 12, 2008.
  44. ^ Suzanne Goldenberg (May 20, 2015). "The doomsday vault: the seeds that could save a post-apocalyptic world". The Guardian. Retrieved June 30, 2017.
  45. ^ "Here's how the world could end—and what we can do about it". Science. AAAS. July 8, 2016. Retrieved March 23, 2018.
  46. ^ Denkenberger, David C.; Pearce, Joshua M. (September 2015). "Feeding everyone: Solving the food crisis in event of global catastrophes that kill crops or obscure the sun" (PDF). Futures. 72: 57–68. doi:10.1016/j.futures.2014.11.008. S2CID 153917693.
  47. ^ globalchallenges.org. Archived from the original on August 16, 2017. Retrieved August 15, 2017.
  48. ^ "Global Catastrophic Risk Policy". gcrpolicy.com. Retrieved August 11, 2019.
  49. ^ Club of Rome (2018). "The Climate Emergency Plan". Retrieved August 17, 2020.
  50. ^ Club of Rome (2019). "The Planetary Emergency Plan". Retrieved August 17, 2020.
  51. ^ Kieft, J.; Bendell, J (2021). "The responsibility of communicating difficult truths about climate influenced societal disruption and collapse: an introduction to psychological research". Institute for Leadership and Sustainability (IFLAS) Occasional Papers. 7: 1–39.
  52. ^ "Mankind must abandon earth or face extinction: Hawking", physorg.com, August 9, 2010, retrieved January 23, 2012
  53. ^ Malik, Tariq (April 13, 2013). "Stephen Hawking: Humanity Must Colonize Space to Survive". Space.com. Retrieved July 1, 2016.
  54. ^ Shukman, David (January 19, 2016). "Hawking: Humans at risk of lethal 'own goal'". BBC News. Retrieved July 1, 2016.
  55. ^ Ginsberg, Leah. "Elon Musk thinks life on earth will go extinct, and is putting most of his fortune toward colonizing Mars". CNBC.
  56. ^ Korycansky, Donald G.; Laughlin, Gregory; Adams, Fred C. (2001). "Astronomical engineering: a strategy for modifying planetary orbits". Astrophysics and Space Science. 275 (4): 349–366. arXiv:astro-ph/0102126. Bibcode:2001Ap&SS.275..349K. doi:10.1023/A:1002790227314. hdl:2027.42/41972. S2CID 5550304.
  57. ^ Fred Hapgood (November 1986). (PDF). Omni. Archived from the original (PDF) on July 27, 2013. Retrieved June 5, 2015.
  58. ^ Giles, Jim (2004). "Nanotech takes small step towards burying 'grey goo'". Nature. 429 (6992): 591. Bibcode:2004Natur.429..591G. doi:10.1038/429591b. PMID 15190320.
  59. ^ Sophie McBain (September 25, 2014). "Apocalypse soon: the scientists preparing for the end times". New Statesman. Retrieved June 5, 2015.
  60. ^ "Reducing Long-Term Catastrophic Risks from Artificial Intelligence". Machine Intelligence Research Institute. Retrieved June 5, 2015. The Machine Intelligence Research Institute aims to reduce the risk of a catastrophe, should such an event eventually occur.
  61. ^ Angela Chen (September 11, 2014). "Is Artificial Intelligence a Threat?". The Chronicle of Higher Education. Retrieved June 5, 2015.
  62. ^ Alexander Sehmar (May 31, 2015). The Independent. Archived from the original on June 2, 2015. Retrieved June 5, 2015.
  63. ^ "About the Lifeboat Foundation". The Lifeboat Foundation. Retrieved April 26, 2013.
  64. ^ Ashlee Vance (July 20, 2010). "The Lifeboat Foundation: Battling Asteroids, Nanobots and A.I." New York Times. Retrieved June 5, 2015.
  65. ^ "Global Catastrophic Risk Institute". gcrinstitute.org. Retrieved March 22, 2022.
  66. ^ Meyer, Robinson (April 29, 2016). "Human Extinction Isn't That Unlikely". The Atlantic. Boston, Massachusetts: Emerson Collective. Retrieved April 30, 2016.
  67. ^ "Global Challenges Foundation website". globalchallenges.org. Retrieved April 30, 2016.
  68. ^ Nick Bilton (May 28, 2015). "Ava of 'Ex Machina' Is Just Sci-Fi (for Now)". New York Times. Retrieved June 5, 2015.
  69. ^ "About Us". Center on Long-Term Risk. Retrieved May 17, 2020. We currently focus on efforts to reduce the worst risks of astronomical suffering (s-risks) from emerging technologies, with a focus on transformative artificial intelligence.
  70. ^ Hui, Sylvia (November 25, 2012). Associated Press. Archived from the original on December 1, 2012. Retrieved January 30, 2012.
  71. ^ Scott Barrett (2014). Environment and Development Economics: Essays in Honour of Sir Partha Dasgupta. Oxford University Press. p. 112. ISBN 9780199677856. Retrieved June 5, 2015.
  72. ^ "Millennium Alliance for Humanity & The Biosphere". Millennium Alliance for Humanity & The Biosphere. Retrieved June 5, 2015.
  73. ^ Guruprasad Madhavan (2012). Practicing Sustainability. Springer Science & Business Media. p. 43. ISBN 9781461443483. Retrieved June 5, 2015.
  74. ^ "Center for International Security and Cooperation". Center for International Security and Cooperation. Retrieved June 5, 2015.
  75. ^ a b Anderson, Nick (February 28, 2019). "Georgetown launches think tank on security and emerging technology". Washington Post. Retrieved March 12, 2019.
  76. ^ World Health Organization. Archived from the original on February 16, 2003. Retrieved June 5, 2015.
  77. ^ Kelley Lee (2013). Historical Dictionary of the World Health Organization. Rowman & Littlefield. p. 92. ISBN 9780810878587. Retrieved June 5, 2015.
  78. ^ USAID. Archived from the original on October 22, 2014. Retrieved June 5, 2015.
  79. ^ "Global Security". Lawrence Livermore National Laboratory. Retrieved June 5, 2015.

External links

  • "Are we on the road to civilisation collapse?". BBC. February 19, 2019.
  • MacAskill, William (August 5, 2022). "The Case for Longtermism". The New York Times.
  • "What a way to go" from The Guardian. Ten scientists name the biggest dangers to Earth and assess the chances they will happen. April 14, 2005.
  • Humanity under threat from perfect storm of crises – study. The Guardian. February 6, 2020.
  • Annual Reports on Global Risk by the Global Challenges Foundation
  • Center on Long-Term Risk
  • Global Catastrophic Risk Policy
  • Stephen Petranek: 10 ways the world could end, a TED talk

Existential risk and existential hope Definitions PDF Future of Humanity Institute Technical Report 2015 1 pp 1 4 Bostrom Nick 2009 Astronomical Waste The opportunity cost of delayed technological development Utilitas 15 3 308 314 CiteSeerX 10 1 1 429 2849 doi 10 1017 s0953820800004076 S2CID 15860897 a b c d e f g h Ord Toby 2020 The Precipice Existential Risk and the Future of Humanity New York Hachette ISBN 9780316484916 a b Bryan Caplan 2008 The totalitarian threat Global Catastrophic Risks eds Bostrom amp Cirkovic Oxford University Press 504 519 ISBN 9780198570509 Glover Dennis June 1 2017 Did George Orwell secretly rewrite the end of Nineteen Eighty Four as he lay dying The Sydney Morning Herald Retrieved November 21 2021 Winston s creator George Orwell believed that freedom would eventually defeat the truth twisting totalitarianism portrayed in Nineteen Eighty Four Orwell George 1949 Nineteen Eighty Four A novel London Secker amp Warburg Sagan Carl Winter 1983 Nuclear War and Climatic Catastrophe Some Policy Implications Foreign Affairs Council on Foreign Relations doi 10 2307 20041818 JSTOR 20041818 Retrieved August 4 2020 Jebari Karim 2014 Existential Risks Exploring a Robust Risk Reduction Strategy PDF Science and Engineering Ethics 21 3 541 54 doi 10 1007 s11948 014 9559 3 PMID 24891130 S2CID 30387504 Retrieved August 26 2018 a b Cirkovic Milan M Bostrom Nick Sandberg Anders 2010 Anthropic Shadow Observation Selection Effects and Human Extinction Risks PDF Risk Analysis 30 10 1495 1506 doi 10 1111 j 1539 6924 2010 01460 x PMID 20626690 S2CID 6485564 Kemp Luke February 2019 Are we on the road to civilization collapse BBC Retrieved August 12 2021 Ord Toby 2020 The Precipice Existential Risk and the Future of Humanity ISBN 9780316484893 Europe survived losing 25 to 50 percent of its population in the Black Death while keeping civilization firmly intact a b Yudkowsky Eliezer 2008 Cognitive Biases Potentially Affecting Judgment of Global Risks PDF Global Catastrophic Risks 91 119 Bibcode 2008gcr book 86Y Desvousges W H Johnson F R Dunford R W Boyle K J Hudson S P and Wilson N 1993 Measuring natural resource damages with contingent valuation tests of validity and reliability In Hausman J A ed Contingent Valuation A Critical Assessment pp 91 159 Amsterdam North Holland Bostrom 2013 Yudkowsky Eliezer Cognitive biases potentially affecting judgment of global risks Global catastrophic risks 1 2008 86 p 114 We re Underestimating the Risk of Human Extinction The Atlantic March 6 2012 Retrieved July 1 2016 Is Humanity Suicidal The New York Times Magazine May 30 1993 a b Cotton Barratt Owen Daniel Max Sandberg Anders 2020 Defence in Depth Against Human Extinction Prevention Response Resilience and Why They All Matter Global Policy 11 3 271 282 doi 10 1111 1758 5899 12786 ISSN 1758 5899 PMC 7228299 PMID 32427180 a b Kupferschmidt Kai January 11 2018 Could science destroy the world These scholars want to save us from a modern day Frankenstein Science AAAS Retrieved April 20 2020 Oxford Institute Forecasts The Possible Doom Of Humanity Popular Science 2013 Retrieved April 20 2020 Toby Ord 2020 The precipice Existential risk and the future of humanity ISBN 9780316484893 The international body responsible for the continued prohibition of bioweapons the Biological Weapons Convention has an annual budget of 1 4 million less than the average McDonald s restaurant a b Matheny Jason Gaverick 2007 Reducing the Risk of Human Extinction PDF Risk Analysis 27 5 1335 1344 doi 10 1111 j 1539 6924 2007 00960 x 
PMID 18076500 S2CID 14265396 Wells Willard 2009 Apocalypse when Praxis ISBN 978 0387098364 Wells Willard 2017 Prospects for Human Survival Lifeboat Foundation ISBN 978 0998413105 Hanson Robin Catastrophe social collapse and human extinction Global catastrophic risks 1 2008 357 Smil Vaclav 2003 The Earth s Biosphere Evolution Dynamics and Change MIT Press p 25 ISBN 978 0 262 69298 4 Denkenberger David C Sandberg Anders Tieman Ross John Pearce Joshua M 2022 Long term cost effectiveness of resilient foods for global catastrophes compared to artificial general intelligence safety International Journal of Disaster Risk Reduction 73 102798 doi 10 1016 j ijdrr 2022 102798 Lewis Smith February 27 2008 Doomsday vault for world s seeds is opened under Arctic mountain The Times Online London Archived from the original on May 12 2008 Suzanne Goldenberg May 20 2015 The doomsday vault the seeds that could save a post apocalyptic world The Guardian Retrieved June 30 2017 Here s how the world could end and what we can do about it Science AAAS July 8 2016 Retrieved March 23 2018 Denkenberger David C Pearce Joshua M September 2015 Feeding everyone Solving the food crisis in event of global catastrophes that kill crops or obscure the sun PDF Futures 72 57 68 doi 10 1016 j futures 2014 11 008 S2CID 153917693 Global Challenges Foundation Understanding Global Systemic Risk globalchallenges org Archived from the original on August 16 2017 Retrieved August 15 2017 Global Catastrophic Risk Policy gcrpolicy com Retrieved August 11 2019 Club of Rome 2018 The Climate Emergency Plan Retrieved August 17 2020 Club of Rome 2019 The Planetary Emergency Plan Retrieved August 17 2020 Kieft J Bendell J 2021 The responsibility of communicating difficult truths about climate influenced societal disruption and collapse an introduction to psychological research Institute for Leadership and Sustainability IFLAS Occasional Papers 7 1 39 Mankind must abandon earth or face extinction Hawking physorg com August 9 2010 retrieved January 23 2012 Malik Tariq April 13 2013 Stephen Hawking Humanity Must Colonize Space to Survive Space com Retrieved July 1 2016 Shukman David January 19 2016 Hawking Humans at risk of lethal own goal BBC News Retrieved July 1 2016 Ginsberg Leah Elon Musk thinks life on earth will go extinct and is putting most of his fortune toward colonizing Mars CNBC Korycansky Donald G Laughlin Gregory Adams Fred C 2001 Astronomical engineering a strategy for modifying planetary orbits Astrophysics and Space Science 275 4 349 366 arXiv astro ph 0102126 Bibcode 2001Ap amp SS 275 349K doi 10 1023 A 1002790227314 hdl 2027 42 41972 S2CID 5550304 Fred Hapgood November 1986 Nanotechnology Molecular Machines that Mimic Life PDF Omni Archived from the original PDF on July 27 2013 Retrieved June 5 2015 Giles Jim 2004 Nanotech takes small step towards burying grey goo Nature 429 6992 591 Bibcode 2004Natur 429 591G doi 10 1038 429591b PMID 15190320 Sophie McBain September 25 2014 Apocalypse soon the scientists preparing for the end times New Statesman Retrieved June 5 2015 Reducing Long Term Catastrophic Risks from Artificial Intelligence Machine Intelligence Research Institute Retrieved June 5 2015 The Machine Intelligence Research Institute aims to reduce the risk of a catastrophe should such an event eventually occur Angela Chen September 11 2014 Is Artificial Intelligence a Threat The Chronicle of Higher Education Retrieved June 5 2015 Alexander Sehmar May 31 2015 Isis could obtain nuclear weapon from Pakistan warns India The 
Independent Archived from the original on June 2 2015 Retrieved June 5 2015 About the Lifeboat Foundation The Lifeboat Foundation Retrieved April 26 2013 Ashlee Vance July 20 2010 The Lifeboat Foundation Battling Asteroids Nanobots and A I New York Times Retrieved June 5 2015 Global Catastrophic Risk Institute gcrinstitute org Retrieved March 22 2022 Meyer Robinson April 29 2016 Human Extinction Isn t That Unlikely The Atlantic Boston Massachusetts Emerson Collective Retrieved April 30 2016 Global Challenges Foundation website globalchallenges org Retrieved April 30 2016 Nick Bilton May 28 2015 Ava of Ex Machina Is Just Sci Fi for Now New York Times Retrieved June 5 2015 About Us Center on Long Term Risk Retrieved May 17 2020 We currently focus on efforts to reduce the worst risks of astronomical suffering s risks from emerging technologies with a focus on transformative artificial intelligence Hui Sylvia November 25 2012 Cambridge to study technology s risks to humans Associated Press Archived from the original on December 1 2012 Retrieved January 30 2012 Scott Barrett 2014 Environment and Development Economics Essays in Honour of Sir Partha Dasgupta Oxford University Press p 112 ISBN 9780199677856 Retrieved June 5 2015 Millennium Alliance for Humanity amp The Biosphere Millennium Alliance for Humanity amp The Biosphere Retrieved June 5 2015 Guruprasad Madhavan 2012 Practicing Sustainability Springer Science amp Business Media p 43 ISBN 9781461443483 Retrieved June 5 2015 Center for International Security and Cooperation Center for International Security and Cooperation Retrieved June 5 2015 a b Anderson Nick February 28 2019 Georgetown launches think tank on security and emerging technology Washington Post Retrieved March 12 2019 Global Alert and Response GAR World Health Organization Archived from the original on February 16 2003 Retrieved June 5 2015 Kelley Lee 2013 Historical Dictionary of the World Health Organization Rowman amp Littlefield p 92 ISBN 9780810878587 Retrieved June 5 2015 USAID Emerging Pandemic Threats Program USAID Archived from the original on October 22 2014 Retrieved June 5 2015 Global Security Lawrence Livermore National Laboratory Retrieved June 5 2015 Further reading EditAvin Shahar Wintle Bonnie C Weitzdorfer Julius o Heigeartaigh Sean S Sutherland William J Rees Martin J 2018 Classifying global catastrophic risks Futures 102 20 26 doi 10 1016 j futures 2018 02 001 Corey S Powell 2000 Twenty ways the world could end suddenly Discover Magazine Derrick Jensen 2006 Endgame ISBN 1 58322 730 X Donella Meadows 1972 The Limits to Growth ISBN 0 87663 165 0 Edward O Wilson 2003 The Future of Life ISBN 0 679 76811 4 Holt Jim The Power of Catastrophic Thinking review of Toby Ord The Precipice Existential Risk and the Future of Humanity Hachette 2020 468 pp The New York Review of Books vol LXVIII no 3 February 25 2021 pp 26 29 Jim Holt writes p 28 Whether you are searching for a cure for cancer or pursuing a scholarly or artistic career or engaged in establishing more just institutions a threat to the future of humanity is also a threat to the significance of what you do Huesemann Michael H and Joyce A Huesemann 2011 Technofix Why Technology Won t Save Us or the Environment Chapter 6 Sustainability or Collapse New Society Publishers Gabriola Island British Columbia Canada 464 pages ISBN 0865717044 Jared Diamond Collapse How Societies Choose to Fail or Succeed Penguin Books 2005 and 2011 ISBN 9780241958681 Jean Francois Rischard 2003 High Noon 20 Global Problems 20 Years to 
Solve Them ISBN 0 465 07010 8 Joel Garreau Radical Evolution 2005 ISBN 978 0385509657 John A Leslie 1996 The End of the World ISBN 0 415 14043 9 Joseph Tainter 1990 The Collapse of Complex Societies Cambridge University Press Cambridge UK ISBN 9780521386739 Martin Rees 2004 Our Final Hour A Scientist s warning How Terror Error and Environmental Disaster Threaten Humankind s Future in This Century On Earth and Beyond ISBN 0 465 06863 4 Roger Maurice Bonnet and Lodewijk Woltjer Surviving 1 000 Centuries Can We Do It 2008 Springer Praxis Books Toby Ord 2020 The Precipice Existential Risk and the Future of Humanity Bloomsbury Publishing ISBN 9781526600219External links Edit Wikiquote has quotations related to Global catastrophic risk Are we on the road to civilisation collapse BBC February 19 2019 MacAskill William August 5 2022 The Case for Longtermism The New York Times What a way to go from The Guardian Ten scientists name the biggest dangers to Earth and assess the chances they will happen April 14 2005 Humanity under threat from perfect storm of crises study The Guardian February 6 2020 Annual Reports on Global Risk by the Global Challenges Foundation Center on Long Term Risk Global Catastrophic Risk Policy Stephen Petranek 10 ways the world could end a TED talk Retrieved from https en wikipedia org w index php title Global catastrophic risk amp oldid 1140772429, wikipedia, wiki, book, books, library,
