Global catastrophic risk

"Global catastrophic risks" redirects here. For the book, see Global Catastrophic Risks (book).
Artist's impression of a major asteroid impact. An asteroid with the impact energy of a billion atomic bombs may have caused the extinction of the dinosaurs.[1]

A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale.[2] Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk.[3]

Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war and pandemics.

Researchers have difficulty studying human extinction directly, since humanity has never been destroyed before.[4] While this does not mean it will not be destroyed in the future, it does make modelling existential risks difficult, due in part to survivorship bias.

Classifications

Scope/intensity grid from Bostrom's paper "Existential Risk Prevention as Global Priority"[5]

Global catastrophic vs existential

Philosopher Nick Bostrom classifies risks according to their scope and intensity.[6] A "global catastrophic risk" is any risk that is at least "global" in scope, and is not subjectively "imperceptible" in intensity. Those that are at least "trans-generational" (affecting all future generations) in scope and "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on Earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity entirely (and, presumably, all but the most rudimentary non-human animal and plant life) or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant.[7]

Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional", scale. He treats such events as worthy of special attention on cost-benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole.[8] Posner's examples include meteor impacts, runaway global warming, grey goo, bioterrorism, and particle accelerator accidents.

Other classifications

Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate. He thinks the most likely sources of bangs are malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed. The most likely causes of this, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures. For example, if a single mind enhances its powers by merging with a computer, it could dominate human civilization. Bostrom believes that this scenario is most likely, followed by flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or current values. He thinks the most likely cause would be evolution changing moral preference, followed by extraterrestrial invasion.[3]

Likelihood

Some risks, such as that from asteroid impact, with a one-in-a-million chance of causing humanity's extinction in the next century,[9] have had their probabilities estimated with considerable precision (although some scholars claim the actual rate of large impacts could be much higher than originally calculated).[10] Similarly, the frequency of volcanic eruptions of sufficient magnitude to cause catastrophic climate change, similar to the Toba eruption, which may have almost caused the extinction of the human race,[11] has been estimated at about 1 in every 50,000 years.[12] The 2016 annual report by the Global Challenges Foundation estimates that an average American is more than five times likelier to die during a human-extinction event than in a car crash.[13][14]
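
The arithmetic behind such comparisons is simple probability. The Python sketch below shows one way an individual's lifetime chance of dying in a human-extinction event can be compared with the lifetime chance of dying in a car crash; the annual probabilities and lifespan are illustrative placeholders, not the figures used by the Global Challenges Foundation.

    # Illustrative lifetime-risk comparison (placeholder inputs, not the
    # Global Challenges Foundation's actual figures).
    extinction_prob_per_year = 0.001    # assumed annual chance of a human-extinction event
    car_crash_prob_per_year = 0.0001    # assumed annual chance of dying in a car crash
    lifetime_years = 80

    def lifetime_risk(annual_prob, years):
        # Probability of the event happening at least once, assuming
        # independent, identically likely years.
        return 1 - (1 - annual_prob) ** years

    extinction_risk = lifetime_risk(extinction_prob_per_year, lifetime_years)
    crash_risk = lifetime_risk(car_crash_prob_per_year, lifetime_years)
    print(extinction_risk, crash_risk, extinction_risk / crash_risk)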

The relative danger posed by other threats is much more difficult to calculate. In 2008, a small but illustrious group of experts on different global catastrophic risks at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19% chance of human extinction over the next century. However, the conference report cautions that the method used to average responses to the informal survey is suspect due to the treatment of non-responses.[15]

Risk                                  Estimated probability of human extinction before 2100 (expert survey)
Overall probability                   19%
Molecular nanotechnology weapons      5%
Superintelligent AI                   5%
Non-nuclear wars                      4%
Engineered pandemic                   2%
Nuclear wars                          1%
Nanotechnology accident               0.5%
Natural pandemic                      0.05%
Nuclear terrorism                     0.03%
Table source: Future of Humanity Institute, 2008.[15]

There are significant methodological challenges in estimating these risks with precision. Most attention has been given to risks to human civilization over the next 100 years, but forecasting over this length of time is difficult. The types of threats posed by nature may prove relatively constant, though new risks could be discovered. Anthropogenic threats, however, are likely to change dramatically with the development of new technology; while volcanoes have been a threat throughout history, nuclear weapons have only been an issue since the 20th century. Historically, the ability of experts to predict the future over these timescales has proved very limited. Man-made threats such as nuclear war or nanotechnology are harder to predict than natural threats, due to the inherent methodological difficulties in the social sciences. In general, it is hard to estimate the magnitude of the risk from any of these dangers, especially as both international relations and technology can change rapidly.

Existential risks pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike most events, the fact that no complete extinction event has occurred in the past is not evidence against its likelihood in the future, because every world that has experienced such an extinction event has no observers; regardless of their frequency, no civilization observes existential risks in its history.[4] These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.[5]

Fermi paradox

Main article: Fermi paradox

In 1950, the Italian physicist Enrico Fermi wondered why humans had not yet encountered extraterrestrial civilizations. He asked, “Where is everybody?”[16] Given the age of the universe and its vast number of stars, unless the Earth is very atypical, extraterrestrial life should be common. So why was there no evidence of extraterrestrial civilizations? This is known as the Fermi paradox.

One of the many proposed explanations, although not widely accepted, for why humans have not yet encountered intelligent life from other planets (aside from the possibility that it simply does not exist) is the probability of existential catastrophes: other potentially intelligent civilizations may have been wiped out before humans could find them or they could find Earth.[4][17]

Moral importance of existential risk

Some scholars have strongly favored reducing existential risk on the grounds that it greatly benefits future generations. Derek Parfit argues that extinction would be a great loss because our descendants could potentially survive for a billion years before the expansion of the Sun makes the Earth uninhabitable.[18][19] Nick Bostrom argues that there is even greater potential in colonizing space. If future humans colonize space, they may be able to support a very large number of people on other planets, potentially lasting for trillions of years.[7] Therefore, reducing existential risk by even a small amount would have a very significant impact on the expected number of people that will exist in the future.

Little has been written arguing against these positions, but some scholars would disagree. Exponential discounting, for instance, might make these future benefits much less significant. Jason Gaverick Matheny has argued that such discounting is inappropriate when assessing the value of existential risk reduction.[9]
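
As a minimal sketch of the disputed step, assuming a constant annual discount rate r, exponential discounting values a benefit realized t years from now at benefit / (1 + r)^t. The Python lines below use purely illustrative numbers to show how quickly far-future value shrinks under such discounting.

    # Illustrative effect of exponential discounting on far-future benefits
    # (placeholder figures, not drawn from the cited literature).
    def present_value(benefit, annual_rate, years):
        # Exponentially discounted value of a benefit realized `years` from now.
        return benefit / (1 + annual_rate) ** years

    benefit = 1.0e12                      # an arbitrary amount of far-future value
    for years in (100, 1_000, 10_000):
        print(years, present_value(benefit, 0.03, years))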

Some economists have discussed the importance of global catastrophic risks, though not existential risks. Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.[20] Richard Posner has argued that we are doing far too little, in general, about small, hard-to-estimate risks of large scale catastrophes.[21]

Numerous cognitive biases can influence people's judgement of the importance of existential risks, including scope insensitivity, hyperbolic discounting, availability heuristic, the conjunction fallacy, the affect heuristic, and the overconfidence effect.[22]

Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as concerned about 200,000 birds getting stuck in oil as they are about 2,000.[23] Similarly, people are often more concerned about threats to individuals than to larger groups.[24]

There are also economic reasons that can explain why so little effort goes into existential risk reduction. Existential risk reduction is a global public good, so even if a large nation decreases it, that nation will enjoy only a small fraction of the benefit of doing so. Furthermore, the vast majority of the benefits may be enjoyed by far-future generations, and though these quadrillions of future people would be willing to pay massive sums for existential risk reduction, the obvious transaction difficulties prevent them from doing so.[25]

Potential sources of risk

Some sources of catastrophic risk are natural, such as meteor impacts or supervolcanoes. Some of these have caused mass extinctions in the past.

Other risks are man-made, such as global warming, environmental degradation, engineered pandemics and nuclear war. According to the Future of Humanity Institute, human extinction is more likely to result from anthropogenic causes than from natural causes.[6][26]

Anthropogenic

In 2012, Cambridge University created the Cambridge Project for Existential Risk, which examines threats to humankind posed by developing technologies.[27] The stated aim is to establish within the University a multidisciplinary research centre, the Centre for the Study of Existential Risk, dedicated to the scientific study and mitigation of existential risks of this kind.[27]

The Cambridge Project states that the "greatest threats" to the human species are man-made; they are artificial intelligence, global warming, nuclear war and rogue biotechnology.[28]

Artificial intelligence

It has been suggested that learning computers that rapidly become superintelligent may take unforeseen actions, or that robots would out-compete humanity (one technological singularity scenario).[29] Because of its exceptional scheduling and organizational capability and the range of novel technologies it could develop, it is possible that the first Earth superintelligence to emerge could rapidly become matchless and unrivaled: conceivably it would be able to bring about almost any possible outcome, and be able to foil virtually any attempt to prevent it from achieving its objectives.[30] It could eliminate any rival intellects if it chose to; alternatively it might manipulate or persuade them to change their behavior towards its own interests, or it may merely obstruct their attempts at interference.[30] In his book Superintelligence: Paths, Dangers, Strategies, Bostrom defines this as the control problem.[31]

Vernor Vinge has suggested that a moment may come when computers and robots are smarter than humans, which he calls "the Singularity".[32] He suggests that it may be somewhat or even very dangerous for humans.[33] This possibility is discussed by a philosophy called Singularitarianism.

Physicist Stephen Hawking, Microsoft founder Bill Gates and SpaceX founder Elon Musk have expressed concerns about the possibility that AI could evolve to the point that humans could not control it, with Hawking theorizing that this could "spell the end of the human race".[34] In 2009, experts attended a conference hosted by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss whether computers and robots might be able to acquire any sort of autonomy, and how much these abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence." They noted that self-awareness as depicted in science-fiction is probably unlikely, but that there were other potential hazards and pitfalls.[32] Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionalities and autonomy, and which pose some inherent concerns.[35][36][37]

Eliezer Yudkowsky believes that risks from artificial intelligence are harder to predict than any other known risks. He also argues that research into artificial intelligence is biased by anthropomorphism. Since people base their judgments of artificial intelligence on their own experience, he claims that they underestimate the potential power of AI. He distinguishes between risks due to technical failure of AI, which means that flawed algorithms prevent the AI from carrying out its intended goals, and philosophical failure, which means that the AI is programmed to realize a flawed ideology.[38]

Biotechnology

Main article: Biotechnology risk

Biotechnology can pose a global catastrophic risk in the form of natural pathogens or novel, engineered ones. Such a catastrophe may be brought about by use in warfare, by terrorist attacks, or by accident.[39] Terrorist applications of biotechnology have historically been infrequent.[39] To what extent this is due to a lack of capabilities or motivation is not resolved.[39]

Exponential growth has been observed in the biotechnology sector and Noun and Chyba predict that this will lead to major increases in biotechnological capabilities in the coming decades.[39] They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control (especially as the technological capabilities are becoming available even to individual users).[39]

Given current development, more risk from novel, engineered pathogens is to be expected in the future.[39] It has been hypothesized that there is an upper bound on the virulence (deadliness) of naturally occurring pathogens.[40] But pathogens may be intentionally or unintentionally genetically modified to change virulence and other characteristics.[39] A group of Australian researchers unintentionally changed characteristics of the mousepox virus while trying to develop a virus to sterilize rodents.[39] The modified virus became highly lethal even in vaccinated and naturally resistant mice.[41][42] The technological means to genetically modify virus characteristics are likely to become more widely available in the future if not properly regulated.[39]

Noun and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and development of facilities to mitigate disease outbreaks (e.g. better and/or more widely distributed vaccines).[39]

Global warming

Global warming refers to the warming caused by human technology since the 19th century or earlier, reflecting abnormal variations from the expected climate within the Earth's atmosphere and subsequent effects on other parts of the Earth. Projections of future climate change suggest further global warming, sea level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters. Effects of global warming include loss of biodiversity, stresses to existing food-producing systems, increased spread of known infectious diseases such as malaria, and rapid mutation of microorganisms.

It has been suggested that runaway global warming (runaway climate change) might cause Earth to become searingly hot like Venus. In less extreme scenarios, it could cause the end of civilization as we know it.[43]

Ecological disaster

An ecological disaster, such as world crop failure and collapse of ecosystem services, could be induced by the present trends of overpopulation, economic development,[44] and non-sustainable agriculture. Most of these scenarios involve one or more of the following: Holocene extinction event, scarcity of water that could lead to approximately one half of the Earth's population being without safe drinking water, pollinator decline, overfishing, massive deforestation, desertification, climate change, or massive water pollution episodes. A very recent threat in this direction is colony collapse disorder,[45] a phenomenon that might foreshadow the imminent extinction[46] of the Western honeybee. As the bee plays a vital role in pollination, its extinction would severely disrupt the food chain.

Experimental technology accident

Further information: Grey goo and Bioterrorism

Nick Bostrom suggested that in the pursuit of knowledge, humanity might inadvertently create a device that could destroy Earth and the Solar System.[47] Investigations in nuclear and high-energy physics could create unusual conditions with catastrophic consequences. For example, scientists worried that the first nuclear test might ignite the atmosphere.[48][49] More recently, others worried that the Relativistic Heavy Ion Collider (RHIC)[50] or the Large Hadron Collider might start a chain-reaction global disaster involving black holes, strangelets, or false vacuum states. These particular concerns have been refuted,[51][52][53][54] but the general concern remains.

Biotechnology could lead to the creation of a pandemic, chemical warfare could be taken to an extreme, and nanotechnology could lead to grey goo, in which out-of-control self-replicating robots consume all living matter on Earth while building more of themselves; in each case, either deliberately or by accident.[55]

Nanotechnology

Many nanoscale technologies are in development or currently in use.[56] The only one that appears to pose a significant global catastrophic risk is molecular manufacturing, a technique that would make it possible to build complex structures with atomic precision.[57] Molecular manufacturing requires significant advances in nanotechnology, but once achieved it could produce highly advanced products at low cost and in large quantities in nanofactories of desktop proportions.[56][57] When nanofactories gain the ability to produce other nanofactories, production may be limited only by relatively abundant factors such as input materials, energy and software.[56]

Molecular manufacturing could be used to cheaply produce, among many other products, highly advanced, durable weapons.[56] Being equipped with compact computers and motors, these could be increasingly autonomous and have a large range of capabilities.[56]

Phoenix and Treder classify catastrophic risks posed by nanotechnology into three categories:

  1. From augmenting the development of other technologies such as AI and biotechnology.
  2. By enabling mass-production of potentially dangerous products that cause risk dynamics (such as arms races) depending on how they are used.
  3. From uncontrolled self-perpetuating processes with destructive effects.

At the same time, nanotechnology may be used to alleviate several other global catastrophic risks.[56]

Several researchers state that the bulk of risk from nanotechnology comes from the potential to lead to war, arms races and destructive global government.[41][56][58] Several reasons have been suggested why the availability of nanotech weaponry may with significant likelihood lead to unstable arms races (compared to e.g. nuclear arms races):

  1. A large number of players may be tempted to enter the race since the threshold for doing so is low;[56]
  2. The ability to make weapons with molecular manufacturing will be cheap and easy to hide;[56]
  3. Therefore, lack of insight into the other parties' capabilities can tempt players to arm out of caution or to launch preemptive strikes;[56][59]
  4. Molecular manufacturing may reduce dependency on international trade,[56] a potential peace-promoting factor;
  5. Wars of aggression may pose a smaller economic threat to the aggressor since manufacturing is cheap and humans may not be needed on the battlefield.[56]

Since self-regulation by all state and non-state actors seems hard to achieve,[60] measures to mitigate war-related risks have mainly been proposed in the area of international cooperation.[56][61] International infrastructure may be expanded, giving more sovereignty to the international level. This could help coordinate efforts for arms control. International institutions dedicated specifically to nanotechnology (perhaps analogous to the International Atomic Energy Agency, IAEA) or to general arms control may also be designed.[61] Parties may also jointly make differential technological progress on defensive technologies, a policy that players should usually favour.[56] The Center for Responsible Nanotechnology also suggests some technical restrictions.[62] Improved transparency regarding technological capabilities may be another important facilitator for arms control.

Grey goo is another catastrophic scenario, proposed by Eric Drexler in his 1986 book Engines of Creation[63] and a recurring theme in mainstream media and fiction.[64][65] This scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nowadays, however, nanotech experts, including Drexler, discredit the scenario. According to Chris Phoenix, a "so-called grey goo could only be the product of a deliberate and difficult engineering process, not an accident".[66]

Warfare and mass destruction

Further information: Nuclear holocaust

The scenarios that have been explored most frequently are nuclear warfare and doomsday devices. Although the probability of a nuclear war in any given year is slim, Professor Martin Hellman has described it as inevitable in the long run: unless the probability approaches zero, there will eventually come a day when civilization's luck runs out.[67] During the Cuban missile crisis, U.S. President John F. Kennedy estimated the odds of nuclear war as being "somewhere between one out of three and even".[68] The United States and Russia have a combined arsenal of 14,700 nuclear weapons,[69] out of an estimated total of 15,700 nuclear weapons in existence worldwide.[69]
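
A minimal sketch of Hellman's point, assuming a fixed and purely illustrative annual probability of nuclear war: the chance of avoiding it for N consecutive years is (1 - p)^N, which shrinks toward zero as N grows, so one can compute how many years it takes for the cumulative risk to pass any threshold.

    import math

    # How long until a fixed annual risk accumulates past 50%?
    # The annual probabilities below are illustrative, not Hellman's estimates.
    def years_until(annual_prob, cumulative_threshold=0.5):
        # Smallest N with 1 - (1 - p)^N >= threshold.
        return math.ceil(math.log(1 - cumulative_threshold) / math.log(1 - annual_prob))

    for p in (0.01, 0.005, 0.001):
        print(p, years_until(p), "years")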

While popular perception sometimes takes nuclear war as "the end of the world", experts assign low probability to human extinction from nuclear war.[70][71] In 1982, Brian Martin estimated that a US–Soviet nuclear exchange might kill 400–450 million people directly, mostly in the United States, Europe and Russia, and perhaps several hundred million more through follow-up consequences in those same areas.[70]

Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating such a large amount of nuclear weaponry would have a long-term effect on the climate, causing cold weather and reduced sunlight[72] that may generate significant upheaval in advanced civilizations.[73]

Beyond nuclear, other threats to humanity include biological warfare (BW), bioterrorism, and chemical warfare.

World population and agricultural crisis

The 20th century saw a rapid increase in human population due to medical developments and massive increases in agricultural productivity[74] brought about by the Green Revolution.[75] Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%. The Green Revolution helped food production keep pace with worldwide population growth, or actually enabled that growth. The energy for the Green Revolution was provided by fossil fuels in the form of fertilizers (natural gas), pesticides (oil), and hydrocarbon-fueled irrigation.[76] David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place the maximum U.S. population for a sustainable economy at 200 million in their study Food, Land, Population and the U.S. Economy. To achieve a sustainable economy and avert disaster, the study says, the United States must reduce its population by at least one-third, and world population will have to be reduced by two-thirds.[77]

The authors of this study believe that the agricultural crisis they describe will begin to affect the world after 2020 and will become critical after 2050. Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief and massive starvation on a global level such as has never been experienced before.[78][79]

Wheat is humanity's third most-produced cereal. Extant fungal infections such as Ug99[80] (a kind of stem rust) can cause 100% crop losses in most modern varieties. Little or no treatment is possible, and the infection spreads on the wind. Should the world's large grain-producing areas become infected, there would be a crisis in wheat availability, leading to price spikes and shortages in other food products.[81]

Non-anthropogenic

Asteroid impact

Several asteroids have collided with Earth in recent geological history. The Chicxulub asteroid, for example, is theorized to have caused the extinction of the non-avian dinosaurs 66 million years ago at the end of the Cretaceous. If such an object struck Earth, it could have a serious impact on civilization. It is even possible that humanity would be completely destroyed. For this to occur, the asteroid would need to be at least 1 km (0.62 mi) in diameter, and probably between 3 and 10 km (2–6 miles).[82] Asteroids with a 1 km diameter have impacted the Earth on average once every 500,000 years.[82] Larger asteroids are less common. Small near-Earth asteroids are regularly observed.
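
As a rough sketch of how such recurrence intervals translate into near-term odds, the Python snippet below treats impacts as a Poisson process with a mean interval of 500,000 years (the figure cited above) and computes the chance of at least one such impact in the next century; the Poisson modelling assumption is illustrative and not taken from the cited source.

    import math

    # Rough conversion of a mean recurrence interval into a per-century probability,
    # assuming impacts follow a Poisson process (an illustrative modelling choice).
    mean_interval_years = 500_000   # average gap between 1 km impacts, as cited above
    window_years = 100

    expected_impacts = window_years / mean_interval_years
    prob_at_least_one = 1 - math.exp(-expected_impacts)
    print(prob_at_least_one)        # about 0.0002, i.e. roughly 1 in 5,000 per century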

In 1.4 million years, the star Gliese 710 is expected to start causing an increase in the number of meteoroids in the vicinity of Earth when it passes within 1.1 light years of the Sun, perturbing the Oort cloud. Dynamic models by García-Sánchez predict a 5% increase in the rate of impact.[83] Objects perturbed from the Oort cloud take millions of years to reach the inner Solar System.

Extraterrestrial invasion

Main article: Alien invasion

Extraterrestrial life could invade Earth[84] either to exterminate and supplant human life, enslave it under a colonial system, steal the planet's resources, or destroy the planet altogether.

Although evidence of alien life has never been documented, scientists such as Carl Sagan have postulated that the existence of extraterrestrial life is very likely. In 1969, the "Extra-Terrestrial Exposure Law" was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991.[85] Scientists consider such a scenario technically possible, but unlikely.[86]

Climate change

Climate change refers to a lasting change in the Earth's climate. The climate has ranged from ice ages to warmer periods when palm trees grew in Antarctica. It has been hypothesized that there was also a period called "snowball Earth" when all the oceans were covered in a layer of ice. These global climatic changes occurred slowly, prior to the rise of human civilization about 10,000 years ago near the end of the last major ice age, when the climate became more stable. However, abrupt climate change on the decade timescale has occurred regionally. Since civilization originated during a period of stable climate, a natural variation into a new climate regime (colder or hotter) could pose a threat to civilization.

In the history of the Earth, twelve ice ages are known to have occurred. Further ice ages are possible at intervals of 40,000–100,000 years. An ice age would have a serious impact on civilization because vast areas of land (mainly in North America, Europe, and Asia) could become uninhabitable. It would still be possible to live in the tropical regions, but with possible loss of humidity and water. The world is currently in an interglacial period within a much older glacial event. The last glacial expansion ended about 10,000 years ago, and all civilizations evolved later than this. Scientists do not predict that a natural ice age will occur anytime soon.

Cosmic threats

A number of astronomical threats have been identified. Massive objects, e.g. a star, large planet or black hole, could be catastrophic if a close encounter occurred in the Solar System. In April 2008, it was announced that two simulations of long-term planetary movement, one at the Paris Observatory and the other at the University of California, Santa Cruz, indicate a 1% chance that Mercury's orbit could be made unstable by Jupiter's gravitational pull sometime during the lifespan of the Sun. Were this to happen, the simulations suggest a collision with Earth could be one of four possible outcomes (the others being Mercury colliding with the Sun, colliding with Venus, or being ejected from the Solar System altogether). If Mercury were to collide with Earth, all life on Earth could be obliterated: an asteroid 15 km wide is believed to have caused the extinction of the non-avian dinosaurs, whereas Mercury is 4,879 km in diameter.[87]

Another threat might come from gamma ray bursts.[88] Both threats are very unlikely in the foreseeable future.[89]

A similar threat is a hypernova, produced when a hypergiant star explodes and then collapses, sending vast amounts of radiation sweeping across hundreds of light-years. Hypernovas have never been observed; however, a hypernova may have been the cause of the Ordovician–Silurian extinction events. The nearest hypergiant is Eta Carinae, approximately 8,000 light-years distant.[90] The hazards from various astrophysical radiation sources were reviewed in 2011.[91]

If the Solar System were to pass through a dark nebula, a cloud of cosmic dust, severe global climate change would occur.[92]

A solar superstorm, which is a drastic and unusual decrease or increase in the Sun's power output, could have severe consequences for life on Earth (see solar flare).

If our universe lies within a false vacuum, a bubble of lower-energy vacuum could come to exist by chance or otherwise in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that we know without forewarning.[93] Such an occurrence is called a vacuum metastability event.

Geomagnetic reversal

Main article: Geomagnetic reversal

The magnetic poles of the Earth have shifted many times in geologic history. The duration of such a shift is still debated. Theories exist that during such times the Earth's magnetic field would be substantially weakened, threatening civilization by allowing radiation from the Sun, especially solar wind, solar flares or cosmic radiation, to reach the surface. These theories have been somewhat discredited, as statistical analysis shows no evidence for a correlation between past reversals and past extinctions.[94][95]

Global pandemic

Main article: Pandemic

The death toll for a pandemic is equal to the virulence (deadliness) of the pathogen or pathogens, multiplied by the number of people eventually infected. It has been hypothesised that there is an upper limit to the virulence of naturally evolved pathogens.[40] This is because a pathogen that quickly kills its hosts might not have enough time to spread to new ones, while one that kills its hosts more slowly or not at all will allow carriers more time to spread the infection, and thus likely out-compete a more lethal species or strain.[96] This simple model predicts that if virulence and transmission are not linked in any way, pathogens will evolve towards low virulence and rapid transmission. However, this assumption is not always valid and in more complex models, where the level of virulence and the rate of transmission are related, high levels of virulence can evolve.[97] The level of virulence that is possible is instead limited by the existence of complex populations of hosts, with different susceptibilities to infection, or by some hosts being geographically isolated.[40] The size of the host population and competition between different strains of pathogens can also alter virulence.[98] However, a pathogen that only infects humans as a secondary host and usually infects another species (a zoonosis) may have little constraint on its virulence in people, since infection here is an accidental event and its evolution is driven by events in another species.[99] There are numerous historical examples of pandemics[100] that have had a devastating effect on a large number of people, which makes the possibility of global pandemic a realistic threat to human civilization.

Megatsunami

Main article: Megatsunami

A remote possibility is a megatsunami. It has been suggested that a megatsunami caused by the collapse of a volcanic island could, for example, destroy the entire East Coast of the United States, but such predictions are based on incorrect assumptions and the likelihood of this happening has been greatly exaggerated in the media.[101] While none of these scenarios are likely to destroy humanity completely, they could regionally threaten civilization. There have been two recent high-fatality tsunamis—after the 2011 Tōhoku earthquake and the 2004 Indian Ocean earthquake—although they were not large enough to be considered megatsunamis. A megatsunami could have astronomical origins as well, such as an asteroid impact in an ocean.[102]

Volcanism

Main article: Supervolcano

A geological event such as massive flood basalt volcanism or the eruption of a supervolcano[103] could lead to a so-called volcanic winter, similar to a nuclear winter. One such event, the Toba eruption,[104] occurred in Indonesia about 71,500 years ago. According to the Toba catastrophe theory,[105] the event may have reduced human populations to only a few tens of thousands of individuals. Yellowstone Caldera is another such supervolcano, having undergone 142 or more caldera-forming eruptions in the past 17 million years.[106] A massive volcanic eruption would eject extraordinary volumes of volcanic dust and toxic and greenhouse gases into the atmosphere, with serious effects on global climate (towards extreme global cooling: volcanic winter if short-term, and ice age if long-term) or global warming (if greenhouse gases were to prevail).

When the supervolcano at Yellowstone last erupted 640,000 years ago, the ash ejected from the caldera covered most of the United States west of the Mississippi River and part of northeastern Mexico.[107] Another such eruption could threaten civilization.

Such an eruption could also release large volumes of gases, increasing carbon dioxide concentrations and causing a runaway greenhouse effect, or throw enough pyroclastic debris and other material into the atmosphere to partially block out the Sun and cause a volcanic winter, as happened on a smaller scale in 1816 following the eruption of Mount Tambora, producing the so-called Year Without a Summer. Such an eruption might cause the immediate deaths of millions of people several hundred miles from the eruption, and perhaps billions of deaths[108] worldwide, due to the failure of the monsoon, resulting in major crop failures and starvation on a massive scale.[108]

A much more speculative concept is the Verneshot: a hypothetical volcanic eruption caused by the buildup of gas deep underneath a craton. Such an event may be forceful enough to launch an extreme amount of material from the crust and mantle into a sub-orbital trajectory.

Precautions and prevention

Planetary management and respecting planetary boundaries have been proposed as approaches to preventing ecological catastrophes. Within the scope of these approaches, the field of geoengineering encompasses the deliberate large-scale engineering and manipulation of the planetary environment to combat or counteract anthropogenic changes in atmospheric chemistry. Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario.[109] Solutions of this scope may require megascale engineering. Global food storage has been proposed, but the monetary cost would be high; furthermore, it would likely add to the current millions of deaths per year due to malnutrition. In Feeding Everyone No Matter What, David Denkenberger and Joshua Pearce have proposed a variety of alternate foods for global catastrophic risks such as nuclear winter, volcanic winter, asteroid/comet impact, and abrupt climate change.[110] The alternate foods convert fossil fuels or biomass (e.g. trees and wood) into food.[111] However, significantly more research is needed in this field to make it viable for the entire global population to survive using these methods.[112] Asteroid deflection has been proposed to reduce impact risk, and nuclear disarmament has been proposed to reduce the nuclear winter risk.

Organizations

The Bulletin of the Atomic Scientists (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of World War II. It studies risks associated with nuclear war and energy and famously maintains the Doomsday Clock, established in 1947. The Foresight Institute (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by K. Eric Drexler, who postulated "grey goo".[114][115]

Beginning after 2000, a growing number of scientists, philosophers and tech billionaires created organizations devoted to studying global risks both inside and outside of academia.[116]

Independent non-governmental organizations (NGOs) include the Machine Intelligence Research Institute (est. 2000), which aims to reduce the risk of a catastrophe caused by artificial intelligence and the Singularity.[117] Its top donors include Peter Thiel and Jed McCaleb.[118] The Lifeboat Foundation (est. 2009) funds research into preventing a technological catastrophe.[119] Most of the research money funds projects at universities.[120] The Global Catastrophic Risk Institute (est. 2011) is a think tank focused on catastrophic risk. It is funded by the NGO Social and Environmental Entrepreneurs. The Global Challenges Foundation (est. 2012), based in Stockholm and founded by Laszlo Szombatfalvy, releases a yearly report on the state of global risks.[13][14] The Future of Life Institute (est. 2014) aims to support research and initiatives for safeguarding life in view of new technologies and challenges facing humanity.[121] Elon Musk is one of its biggest donors.[122] The Nuclear Threat Initiative seeks to reduce global threats from nuclear, biological and chemical weapons, and to contain damage after an event.[123] It maintains a nuclear material security index.[124]

University-based organizations include the Future of Humanity Institute (est. 2005), which researches the questions of humanity's long-term future, particularly existential risk. It was founded by Nick Bostrom and is based at Oxford University. The Centre for the Study of Existential Risk (est. 2012) is a Cambridge-based organization which studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare. All are man-made risks, as Huw Price explained to the AFP news agency: "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us."[125] Stephen Hawking is an acting adviser. The Millennium Alliance for Humanity & The Biosphere is a Stanford University-based organization focusing on many issues related to global catastrophe by bringing together members of academia in the humanities.[126][127] It was founded by Paul Ehrlich, among others.[128] Stanford University also has the Center for International Security and Cooperation, which focuses on political cooperation to reduce global catastrophic risk.[129]

Other risk assessment groups are based in or are part of governmental organizations. The World Health Organization (WHO) includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises.[130] GAR helps member states with training and coordination of responses to epidemics.[131] The United States Agency for International Development (USAID) has its Emerging Pandemic Threats Program, which aims to prevent and contain naturally generated pandemics at their source.[132] The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate, which researches issues such as bio-security and counter-terrorism on behalf of the government.[133]

See also

Notes

  1. Schulte, P.; et al. (5 March 2010). "The Chicxulub Asteroid Impact and Mass Extinction at the Cretaceous-Paleogene Boundary". Science 327 (5970): 1214–1218. Bibcode:2010Sci...327.1214S. doi:10.1126/science.1177265. PMID 20203042.
  2. Bostrom, Nick (2008). Global Catastrophic Risks (PDF). Oxford University Press. p. 1.
  3. 1 2 Bostrom, Nick (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology 9.
  4. 1 2 3 "Observation Selection Effects and Global Catastrophic Risks", Milan Cirkovic, 2008
  5. 1 2 Bostrom, N (2013). "Existential Risk Prevention as Global Priority" (PDF). Global Policy 4: 15–31. doi:10.1111/1758-5899.12002. Retrieved 2014-07-07.
  6. 1 2 Bostrom, Nick. "Existential Risk Prevention as a Global Priority". Existential Risk. Future of Humanity Institute. Retrieved 23 July 2013.
  7. 1 2 Bostrom, Nick. "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas 15 (3): 308–314. doi:10.1017/s0953820800004076.
  8. Posner, Richard A. (2006). Catastrophe : risk and response. Oxford: Oxford University Press. ISBN 978-0195306477., Introduction, "What is Catastrophe?"
  9. 1 2 Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction" (PDF). Risk Analysis 27 (5): 1335–1344. doi:10.1111/j.1539-6924.2007.00960.x. PMID 18076500.
  10. Asher, D.J.; Bailey, M.E.; Emel'yanenko, V.; Napier, W.M. (2005). "Earth in the cosmic shooting gallery" (PDF). The Observatory 125: 319–322. Bibcode:2005Obs...125..319A.
  11. Ambrose 1998; Rampino & Ambrose 2000, pp. 71, 80.
  12. Rampino, M.R.; Ambrose, S.H. (2002). "Super eruptions as a threat to civilizations on Earth-like planets" (PDF). Icarus 156: 562–569. Bibcode:2002Icar..156..562R. doi:10.1006/icar.2001.6808.
  13. 1 2 Robinson Meyer (April 29, 2016). "Human Extinction Isn't That Unlikely". The Atlantic. Retrieved April 30, 2016.
  14. 1 2 "Global Challenges Foundation website". globalchallenges.org. Retrieved April 30, 2016.
  15. 1 2 Global Catastrophic Risks Survey, Technical Report, 2008, Future of Humanity Institute
  16. Jones, E. M. (March 1, 1985). "'Where is everybody?' An account of Fermi's question". Los Alamos National Laboratory (LANL), United States Department of Energy. Retrieved January 12, 2013.
  17. Ventrudo, Brian (5 June 2009). "So Where Is ET, Anyway?". Universe Today. Retrieved 10 March 2014. Some believe [the Fermi Paradox] means advanced extraterrestrial societies are rare or nonexistent. Others suggest they must destroy themselves before they move on to the stars.
  18. Parfit, Derek (1984). Reasons and Persons. Oxford University Press. pp. 453–454.
  19. Carrington, Damian (21 February 2000). "Date set for desert Earth". BBC News Online.
  20. Weitzman, Martin (2009). "On modeling and interpreting the economics of catastrophic climate change" (PDF). The Review of Economics and Statistics 91 (1): 1–19. doi:10.1162/rest.91.1.1.
  21. Posner, Richard (2004). Catastrophe: Risk and Response. Oxford University Press.
  22. https://www.stat.berkeley.edu/~aldous/157/Papers/yudkowsky.pdf
  23. Desvousges, W.H., Johnson, F.R., Dunford, R.W., Boyle, K.J., Hudson, S.P., and Wilson, N. 1993, Measuring natural resource damages with contingent valuation: tests of validity and reliability. In Hausman, J.A. (ed), Contingent Valuation:A Critical Assessment, pp. 91−159 (Amsterdam: North Holland).
  24. Eliezer Yudkowsky, 2008, "Cognitive Biases potentially affecting judgments of global risks"
  25. http://www.existential-risk.org/concept.pdf
  26. "Frequently Asked Questions". Existential Risk. Future of Humanity Institute. Retrieved 26 July 2013.
  27. 1 2 "The Cambridge Project for Existential Risk". Cambridge University.
  28. "'Terminator center' to open at Cambridge University". Fox News. 2012-11-26.
  29. Bill Joy, Why the future doesn't need us. Wired magazine.
  30. 1 2 Nick Bostrom 2002 "Ethical Issues in Advanced Artificial Intelligence"
  31. Bostrom, Nick. Superintelligence: Paths, Dangers, Strategies.
  32. 1 2 Scientists Worry Machines May Outsmart Man By JOHN MARKOFF, NY Times, July 26, 2009.
  33. The Coming Technological Singularity: How to Survive in the Post-Human Era, by Vernor Vinge, Department of Mathematical Sciences, San Diego State University, (c) 1993 by Vernor Vinge.
  34. Rawlinson, Kevin. "Microsoft's Bill Gates insists AI is a threat". BBC News. Retrieved 30 January 2015.
  35. Gaming the Robot Revolution: A military technology expert weighs in on Terminator: Salvation., By P. W. Singer, slate.com Thursday, May 21, 2009.
  36. Robot takeover, gyre.org.
  37. robot page, engadget.com.
  38. Yudkowsky, Eliezer. "Artificial Intelligence as a Positive and Negative Factor in Global Risk". Retrieved 26 July 2013.
  39. 1 2 3 4 5 6 7 8 9 10 Ali Noun; Christopher F. Chyba (2008). "Chapter 20: Biotechnology and biosecurity". In Bostrom, Nick; Cirkovic, Milan M. Global Catastrophic Risks. Oxford University Press.
  40. 1 2 3 Frank SA (March 1996). "Models of parasite virulence" (PDF). Q Rev Biol 71 (1): 37–78. doi:10.1086/419267. PMID 8919665.
  41. 1 2 Sandberg, Anders. "The five biggest threats to human existence". theconversation.com. Retrieved 13 July 2014.
  42. Jackson, Ronald J.; Ramsay, Alistair J.; Christensen, Carina D.; Beaton, Sandra; Hall, Diana F.; Ramshaw, Ian A. (2001). "Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox". Journal of Virology 75 (3): 1205–1210. doi:10.1128/jvi.75.3.1205-1210.2001. Retrieved 13 July 2014.
  43. Isaac M. Held, Brian J. Soden, "Water Vapor Feedback and Global Warming", In: Annu. Rev. Energy Environ 2000. Page 449.
  44. Chiarelli, B. (1998). "Overpopulation and the Threat of Ecological Disaster: the Need for Global Bioethics". Mankind Quarterly 39 (2): 225–230.
  45. Evans-Pritchard, Ambrose (6 February 2011). "Einstein was right - honey bee collapse threatens global food security". The Daily Telegraph (London).
  46. Lovgren, Stefan. "Mystery Bee Disappearances Sweeping U.S." National Geographic News. URL accessed March 10, 2007.
  47. Bostrom 2002, section 4.8
  48. Richard Hamming. "Mathematics on a Distant Planet".
  49. "Report LA-602, ''Ignition of the Atmosphere With Nuclear Bombs''" (PDF). Retrieved 2011-10-19.
  50. New Scientist, 28 August 1999: "A Black Hole Ate My Planet"
  51. Konopinski, E. J; Marvin, C.; Teller, Edward (1946). "Ignition of the Atmosphere with Nuclear Bombs" (PDF) (Declassified February 1973) (LA–602). Los Alamos National Laboratory. Retrieved 23 November 2008.
  52. "Statement by the Executive Committee of the DPF on the Safety of Collisions at the Large Hadron Collider."
  53. "Safety at the LHC".
  54. J. Blaizot et al., "Study of Potentially Dangerous Events During Heavy-Ion Collisions at the LHC", CERN library record CERN Yellow Reports Server (PDF)
  55. Eric Drexler, Engines of Creation, ISBN 0-385-19973-2, available online
  56. 1 2 3 4 5 6 7 8 9 10 11 12 13 14 Chris Phoenix; Mike Treder (2008). "Chapter 21: Nanotechnology as global catastrophic risk". In Bostrom, Nick; Cirkovic, Milan M. Global catastrophic risks. Oxford: Oxford University Press. ISBN 978-0-19-857050-9.
  57. 1 2 "Frequently Asked Questions - Molecular Manufacturing". foresight.org. Retrieved 19 July 2014.
  58. Drexler, Eric. "A Dialog on Dangers". foresight.org. Retrieved 19 July 2014.
  59. Drexler, Eric. "ENGINES OF DESTRUCTION (Chapter 11)". e-drexler.com. Retrieved 19 July 2014.
  60. "Dangers of Molecular Manufacturing". crnano.org. Retrieved 19 July 2014.
  61. 1 2 "The Need for International Control". crnano.org. Retrieved 19 July 2014.
  62. "Technical Restrictions May Make Nanotechnology Safer". crnano.org. Retrieved 19 July 2014.
  63. Joseph, Lawrence E. (2007). Apocalypse 2012. New York: Broadway. p. 6. ISBN 978-0-7679-2448-1.
  64. Rincon, Paul (2004-06-09). "Nanotech guru turns back on 'goo'". BBC News. Retrieved 2012-03-30.
  65. Hapgood, Fred (November 1986). "Nanotechnology: Molecular Machines that Mimic Life" (PDF). Omni. Retrieved 19 July 2014.
  66. "Leading nanotech experts put 'grey goo' in perspective". crnano.org. Retrieved 19 July 2014.
  67. "On the Probability of Nuclear War" by Martin E. Hellman
  68. Nuclear Weapons and the Future of Humanity: The Fundamental Questions by Avner Cohen, Steven Lee, p. 237, at Google Books
  69. 1 2 Federation of American Scientists (28 April 2015). "Status of World Nuclear Forces". Federation of American Scientists. Retrieved 4 June 2015.
  70. 1 2 Martin, Brian (1982). "Critique of nuclear extinction". Journal of Peace Research 19 (4): 287–300. doi:10.1177/002234338201900401. Retrieved 25 October 2014.
  71. Shulman, Carl (5 Nov 2012). "Nuclear winter and human extinction: Q&A with Luke Oman". Overcoming Bias. Retrieved 25 October 2014.
  72. "Atmospheric effects and societal consequences of regional scale nuclear conflicts and acts of individual nuclear terrorism", Atmospheric Chemistry and Physics
  73. Bostrom 2002, section 4.2.
  74. "The end of India's green revolution?". BBC News. 2006-05-29. Retrieved 2012-01-31.
  75. "Food First/Institute for Food and Development Policy". Foodfirst.org. 2000-04-08. Retrieved 2012-01-31.
  76. "How peak oil could lead to starvation". Web.archive.org. 2009-05-27. Archived from the original on May 27, 2009. Retrieved 2012-01-31.
  77. "Eating Fossil Fuels". EnergyBulletin.net. 2003-10-02. Retrieved 2012-01-31.
  78. The Oil Drum: Europe. "Agriculture Meets Peak Oil". Europe.theoildrum.com. Retrieved 2012-01-31.
  79. "Drawing Momentum from the Crash" by Dale Allen Pfeiffer
  80. "Cereal Disease Laboratory : Ug99 an emerging virulent stem rust race". Ars.usda.gov. Retrieved 2012-01-31.
  81. "Durable Rust Resistance in Wheat". Wheatrust.cornell.edu. Retrieved 2012-01-31.
  82. 1 2 Bostrom 2002, section 4.10
  83. García-Sánchez, Joan; et al. (February 1999). "Stellar Encounters with the Oort Cloud Based on HIPPARCOS Data". The Astronomical Journal 117 (2): 1042–1055. Bibcode:1999AJ....117.1042G. doi:10.1086/300723.
  84. Twenty ways the world could end suddenly, Discover Magazine
  85. Urban Legends Reference Pages: Legal Affairs (E.T. Make Bail)
  86. Bostrom 2002, section 7.2
  87. Ken Croswell, Will Mercury Hit Earth Someday?, Skyandtelescope.com April 24, 2008, accessed April 26, 2008
  88. Explosions in Space May Have Initiated Ancient Extinction on Earth, NASA.
  89. Bostrom 2002, section 4.7
  90. Wanjek, Christopher (2005-04-06). "Explosions in Space May Have Initiated Ancient Extinction on Earth". NASA.
  91. Melott, A.L. and Thomas, B.C. (2011). "Astrophysical Ionizing Radiation and the Earth: A Brief Review and Census of Intermittent Intense Sources" (PDF). Astrobiology 11: 343–361. arXiv:1102.2830. Bibcode:2011AsBio..11..343M. doi:10.1089/ast.2010.0603.
  92. Fraser Cain (2003-08-04). "Local Galactic Dust is on the Rise". Universe Today.
  93. Coleman, Sidney; De Luccia, Frank (1980-06-15). "Gravitational effects on and of vacuum decay" (PDF). Physical Review D D21 (12): 3305–3315. Bibcode:1980PhRvD..21.3305C. doi:10.1103/PhysRevD.21.3305.
  94. Plotnick, Roy E. (1 January 1980). "Relationship between biological extinctions and geomagnetic reversals". Geology 8 (12): 578. Bibcode:1980Geo.....8..578P. doi:10.1130/0091-7613(1980)8<578:RBBEAG>2.0.CO;2.
  95. Glassmeier, Karl-Heinz; Vogt, Joachim (29 May 2010). "Magnetic Polarity Transitions and Biospheric Effects". Space Science Reviews 155 (1-4): 387–410. Bibcode:2010SSRv..155..387G. doi:10.1007/s11214-010-9659-6.
  96. Brown NF, Wickham ME, Coombes BK, Finlay BB (May 2006). "Crossing the Line: Selection and Evolution of Virulence Traits". PLoS Pathogens 2 (5): e42. doi:10.1371/journal.ppat.0020042. PMC 1464392. PMID 16733541.
  97. Ebert D, Bull JJ (January 2003). "Challenging the trade-off model for the evolution of virulence: is virulence management feasible?". Trends Microbiol. 11 (1): 15–20. doi:10.1016/S0966-842X(02)00003-3. PMID 12526850.
  98. André JB, Hochberg ME (July 2005). "Virulence evolution in emerging infectious diseases". Evolution 59 (7): 1406–12. doi:10.1554/05-111. PMID 16153027.
  99. Gandon S (March 2004). "Evolution of multihost parasites". Evolution 58 (3): 455–69. doi:10.1111/j.0014-3820.2004.tb01669.x. PMID 15119430.
  100. "Near Apocalypse Causing Diseases, a Historical Look:". postapocalypticsurvival.com. Retrieved 2012-05-05.
  101. Pararas-Carayannis, George (2002). "Evaluation of the threat of mega tsunami generation from postulated massive slope failures of island volcanoes on La Palma, Canary Islands, and on the island of Hawaii". drgeorgepc.com. Retrieved 2008-12-20.
  102. Prehistoric Asteroid "Killed Everything"
  103. Kate Ravilious (2005-04-14). "What a way to go". The Guardian.
  104. 2012 Admin (2008-02-04). "Toba Supervolcano". 2012 Final Fantasy.
  105. Science Reference. "Toba Catastrophe Theory". Science Daily.
  106. Breining, Greg (2007). Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. Voyageur Press. p. 256. ISBN 978-0-7603-2925-2.
  107. Breining, Greg (2007). "Distant Death". Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. St. Paul, MN.: Voyageur Press. p. 256 pg. ISBN 978-0-7603-2925-2.
  108. 1 2 Breining, Greg (2007). "The Next Big Blast". Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. St. Paul, MN.: Voyageur Press. p. 256 pg. ISBN 978-0-7603-2925-2.
  109. "Mankind must abandon earth or face extinction: Hawking", physorg.com, August 9, 2010, retrieved 2012-01-23
  110. D.C. Denkenberger and J. M. Pearce. Feeding Everyone No Matter What: Managing Food Security After Global Catastrophe, Elsevier, San Francisco, 2014.
  111. Denkenberger, D. C., & Pearce, J. M. (2015). Feeding Everyone: Solving the Food Crisis in Event of Global Catastrophes that Kill Crops or Obscure the Sun. Futures. 72:57–68. open access
  112. Baum, S.D., Denkenberger, D.C., A Pearce, J.M., Robock, A., Winkler, R. Resilience to global food supply catastrophes. Environment, Systems and Decisions 35(2), pp 301-313 (2015). open access
  113. Lewis Smith (2008-02-27). "Doomsday vault for world’s seeds is opened under Arctic mountain". London: The Times Online.
  114. Fred Hapgood (November 1986). "Nanotechnology: Molecular Machines that Mimic Life" (PDF). Omni. Retrieved June 5, 2015.
  115. Giles, Jim (2004). "Nanotech takes small step towards burying 'grey goo'". Nature 429 (6992): 591. Bibcode:2004Natur.429..591G. doi:10.1038/429591b. PMID 15190320.
  116. Sophie McBain (September 25, 2014). "Apocalypse soon: the scientists preparing for the end times". New Statesman. Retrieved June 5, 2015.
  117. "Reducing Long-Term Catastrophic Risks from Artificial Intelligence". Machine Intelligence Research Institute. Retrieved June 5, 2015. The Machine Intelligence Research Institute aims to reduce the risk of a catastrophe, should such an event eventually occur.
  118. Angela Chen (September 11, 2014). "Is Artificial Intelligence a Threat?". The Chronicle of Higher Education. Retrieved June 5, 2015.
  119. "About the Lifeboat Foundation". The Lifeboat Foundation. Retrieved 26 April 2013.
  120. Ashlee Vance (July 20, 2010). "The Lifeboat Foundation: Battling Asteroids, Nanobots and A.I.". New York Times. Retrieved June 5, 2015.
  121. "The Future of Life Institute". Retrieved May 5, 2014.
  122. Nick Bilton (May 28, 2015). "Ava of ‘Ex Machina’ Is Just Sci-Fi (for Now)". New York Times. Retrieved June 5, 2015.
  123. "Nuclear Threat Initiative". Nuclear Threat Initiative. Retrieved June 5, 2015.
  124. Alexander Sehmar (May 31, 2015). "Isis could obtain nuclear weapon from Pakistan, warns India". The Independent. Retrieved June 5, 2015.
  125. Hui, Sylvia (25 November 2012). "Cambridge to study technology's risks to humans". Associated Press. Retrieved 30 January 2012.
  126. Scott Barrett (2014). Environment and Development Economics: Essays in Honour of Sir Partha Dasgupta. Oxford University Press. p. 112. Retrieved June 5, 2015.
  127. "Millennium Alliance for Humanity & The Biosphere". Millennium Alliance for Humanity & The Biosphere. Retrieved June 5, 2015.
  128. Guruprasad Madhavan (2012). Practicing Sustainability. Springer Science & Business Media. p. 43. Retrieved June 5, 2015.
  129. "Center for International Security and Cooperation". Center for International Security and Cooperation. Retrieved June 5, 2015.
  130. "Global Alert and Response (GAR)". World Health Organization. Retrieved June 5, 2015.
  131. Kelley Lee (2013). Historical Dictionary of the World Health Organization. Rowman & Littlefield. p. 92. Retrieved June 5, 2015.
  132. "USAID Emerging Pandemic Threats Program". USAID. Retrieved June 5, 2015.
  133. "Global Security". Lawrence Livermore National Laboratory. Retrieved June 5, 2015.

References

Further reading

External links
