GCR Concept Project

PROJECT LEAD: Seth Baum

What is global catastrophic risk? Why is GCR important? Which GCRs are most important? How does GCR differ from other, related concepts? As a leader in the GCR field, GCRI works on the conceptual foundations of GCR.

In simplest terms, GCRs are risks of events that could significantly harm or even destroy human civilization at the global scale. Exactly what qualifies as a global catastrophe is a matter of ongoing debate. For example, global catastrophes have been defined as events in which more than one quarter of the human population dies [1], events that cause at least 10 million deaths or $10 trillion in damages [2], or events that would permanently eliminate human civilization’s capacity to colonize space and thus sustain human life beyond the existence of Earth [3]. Whatever the specifics, it is clear that GCRs are risks of the highest magnitude: global catastrophes are the worst-case scenarios for humanity.

GCRI supports open debate on the definition of GCR. However, we tend to think of GCR as the risk of events that would cause large harm to human civilization, and we tend to focus on GCRs that would cause large and permanent harm to human civilization. This is roughly synonymous with some usages of the term existential risk (“X-Risk”) [4]. Indeed, we often find ourselves using GCR and X-Risk interchangeably. Fortunately, the GCR/X-Risk community has avoided heated debates about which term is better, instead focusing its energies on the risks we all believe are extremely important.

GCRI research has produced several advances in the GCR concept. One advance is the development of the concepts of adaptation to and recovery from global catastrophe [5]. Suppose there is an initial global catastrophe that causes major harm but does not immediately kill everyone. The catastrophe will leave dramatically different environmental and social conditions for the survivors to cope with. The survivors must adapt to these new conditions in order to stay alive. Additionally, the survivors must recover the sort of advanced civilization that now exists in order to achieve certain great things, such as the colonization of outer space. (Space colonization would be a great thing because it would open up astronomically large opportunities for civilization [6].) We analyze global catastrophe adaptation and recovery in GCRI’s aftermath of global catastrophes project.


Another conceptual advance coming from GCRI research is a definition of GCR in terms of resilience. Resilience is the ability of a system to withstand disturbances while retaining its essential form and function. GCRI’s work builds on earlier research proposing that the global environmental system has thresholds that would be catastrophic to cross, and that “planetary boundaries” should be set by humanity as safe limits to avoid crossing those thresholds [7]. GCRI research likewise proposed that global catastrophes can be defined as disturbances that exceed the resilience of the global human system. We can in that way think of there being boundaries that it would not be safe for the global human system to cross [8]. This conceptual advance paves the way for assessing the resilience of the global human system and identifying how to ensure that its resilience is never exceeded.
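As an illustrative formalization (a sketch in our own shorthand, not the exact notation of [8]): let D denote the magnitude of a disturbance to the global human system, R the system’s resilience (the largest disturbance it can absorb while retaining its essential form and function), and B a boundary chosen by humanity and kept below R as a margin of safety. Then:

% Illustrative sketch only; the symbols D, R, and B are our shorthand, not notation from [8].
% D: magnitude of a disturbance to the global human system
% R: the system's resilience (largest disturbance it can absorb)
% B: a humanity-chosen boundary kept below R as a margin of safety
\[
  \text{global catastrophe} \iff D > R,
  \qquad
  P(\text{global catastrophe}) = P(D > R),
  \qquad
  B < R .
\]

On this reading, assessing the resilience of the global human system amounts to estimating R and the uncertainty around it, and risk management amounts to keeping disturbances at or below a boundary B set conservatively enough that R is never exceeded.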

References

[1] Austen Atkinson, 1999. Impact Earth: Asteroids, Comets and Meteors—The Growing Threat. London: Virgin. Cited in C.M. Hempsell, 2004. The potential for space intervention in global catastrophes. Journal of the British Interplanetary Society 57, 14-21.

[2] More precisely, global catastrophes are defined in this reference as events causing at least X deaths or Y monetary damages, with X being some unspecified point between 10 thousand and 10 million, and Y being some unspecified point between $10 billion and $10 trillion. The reference is: Nick Bostrom and Milan Ćirković, 2008. Introduction. In Bostrom and Ćirković (eds), Global Catastrophic Risks. Oxford: Oxford University Press.

[3] Seth D. Baum, 2010. Is humanity doomed? Insights from astrobiology. Sustainability 2(2), 591-603.

[4] Nick Bostrom, 2002. Existential risks: Analyzing human extinction scenarios and related hazards. Journal of Evolution and Technology 9. Nick Bostrom, 2013. Existential risk prevention as global priority. Global Policy 4(1), 15-31.

[5] Timothy M. Maher, Jr. and Seth D. Baum, 2013. Adaptation to and recovery from global catastrophe. Sustainability 5(4), 1461-1479.

[6] Isaac Asimov, 1981. A Choice of Catastrophes: The Disasters That Threaten Our World. New York: Ballantine Books. Milan M. Ćirković, 2002. Cosmological forecast and its practical significance. Journal of Evolution and Technology 12.

[7] J. Rockström, W. Steffen, K. Noone, Å. Persson, F.S. Chapin III, E. Lambin, et al., 2009. Planetary boundaries: Exploring the safe operating space for humanity. Ecology and Society 14(2), 32.

[8] Seth D. Baum and Itsuki C. Handoh, 2014. Integrating the planetary boundaries and global catastrophic risk paradigms. Ecological Economics 107, 13-21.