Risk & Decision Analysis

GCRI pioneers the application of risk and decision analysis to the study of global catastrophic risk.

Risk is commonly quantified as the probability of an adverse event multiplied by the severity of the event if it occurs. The field of risk analysis is largely dedicated to characterizing and quantifying potential adverse events in terms of their probability and severity. The field of decision analysis is dedicated to the formal evaluation and optimization of decision options in terms of risk and other (often quantitative) criteria. Risk and decision analysis can be highly effective tools for informing and improving the quality of decision making, ultimately serving to make the world safer and better. However, the complex, extreme, and unprecedented nature of global catastrophe makes it a difficult subject for risk and decision analysis. Careful risk analysis can nonetheless be an important source of information about global catastrophic risk.
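As a generic formalization of this standard quantification (a textbook convention, not specific to GCRI's work): if adverse event i occurs with probability p_i and has severity s_i, the total risk is the expected severity, which for a single event reduces to probability times severity.

```latex
% Risk as expected severity over possible adverse events i
R = \sum_{i} p_i \, s_i
% Single-event case: R = p \cdot s
```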

GCRI evaluates the applicability of risk and decision analysis to global catastrophic risk and conducts select risk and decision analyses. We draw on deep experience in professional risk analysis, applying classic methodologies such as fault trees, event trees, and expert elicitation. We are especially attentive to the challenge of rigorously quantifying global catastrophic risks and the uncertainties surrounding them. Some argue for quantifying everything; others argue against quantifying anything so uncertain. We take an intermediate approach: we are willing to quantify, but only when there is a sufficiently rigorous basis for doing so. We have applied risk and decision analysis mainly to risks from artificial intelligence and nuclear war.
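As a toy illustration of the fault tree methodology mentioned above, a fault tree combines basic-event probabilities through AND/OR gates to estimate the probability of a top event. The structure and all numbers below are hypothetical placeholders, not GCRI estimates.

```python
# Minimal fault-tree sketch: combine independent basic-event probabilities
# through AND/OR gates to get a top-event probability.

def and_gate(probs):
    """All inputs must occur: multiply probabilities (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """At least one input occurs: complement of none occurring."""
    p_none = 1.0
    for q in probs:
        p_none *= 1.0 - q
    return 1.0 - p_none

# Hypothetical top event: an inadvertent launch requires a false alarm AND
# a failure of de-escalation; a false alarm can arise from several causes.
p_false_alarm = or_gate([0.01, 0.005])   # sensor error OR misread exercise
p_top = and_gate([p_false_alarm, 0.1])   # false alarm AND de-escalation failure
print(f"Illustrative top-event probability: {p_top:.5f}")
```

Event trees work in the opposite direction, branching forward from an initiating event over the possible outcome sequences.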

For further discussion, please see our blog post The Role of Risk and Decision Analysis in Global Catastrophic Risk Reduction.

Featured Publications

Global catastrophes: The most extreme risks

Seth D. Baum and Anthony M. Barrett. In Vicki Bier (Editor), Risk in Extreme Environments: Preparing, Avoiding, Mitigating, and Managing. New York: Routledge, 2018, pages 174-184.

This paper describes fundamental challenges in the analysis of global catastrophic risks. The challenges include extremely large and possibly infinite severity, the lack of historical precedent and empirical data, and the complex and interconnected nature of many global catastrophic risks.

Value of global catastrophic risk (GCR) information: Cost-effectiveness-based approach for GCR reduction

Anthony M. Barrett. Decision Analysis, vol. 14, no. 3, 2017, pages 187-203, DOI 10.1287/deca.2017.0350.

This paper uses a value of information (VoI) framework to assess the value of research that would reduce global catastrophic risk. Given the very large stakes of global catastrophe, the paper uses a cost-effectiveness framework instead of the usual benefit-based VoI framework. The paper presents examples involving asteroid collisions and nuclear war.
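The paper's own model is more detailed, but the core VoI idea can be sketched generically: information has value when it would change the decision. In the hypothetical example below (all numbers invented for illustration), a decision maker chooses between two risk-reduction measures of equal cost, and perfect information about which measure works better increases the expected risk averted per dollar.

```python
# Generic value-of-information sketch in cost-effectiveness terms:
# expected risk reduction per dollar, with and without resolving
# uncertainty about which of two measures works better.
# All numbers are hypothetical, not the paper's estimates.

states = [0.5, 0.5]  # two equally likely states of the world
reduction = {        # catastrophe probability averted by each measure, per state
    "measure_1": [1e-4, 2e-5],
    "measure_2": [5e-5, 8e-5],
}
cost = 1e6  # dollars, same for either measure

def expected(m):
    """Expected risk reduction of measure m over the states."""
    return sum(p * r for p, r in zip(states, reduction[m]))

# Without information: commit now to the measure with the best expectation.
best_without = max(expected(m) for m in reduction)

# With perfect information: learn the state first, then pick the best measure.
best_with = sum(
    p * max(reduction[m][i] for m in reduction)
    for i, p in enumerate(states)
)

voi = best_with - best_without  # extra catastrophe probability averted
print(f"Risk averted per $: without info {best_without/cost:.2e}, "
      f"with info {best_with/cost:.2e}, VoI {voi/cost:.2e}")
```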

Risk analysis and risk management for the artificial superintelligence research and development process

Anthony M. Barrett and Seth D. Baum. In Victor Callaghan, James Miller, Roman Yampolskiy, and Stuart Armstrong (Editors), The Technological Singularity: Managing the Journey. Berlin: Springer, 2017, pages 127-140.

This paper outlines risk and decision analysis methodologies that can be applied to the study of artificial superintelligence. The methodologies include fault trees and event trees, expert elicitation, and value of information.
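Fault trees and value of information are sketched above. For expert elicitation, one common aggregation step is a linear opinion pool, illustrated below with hypothetical experts, weights, and probabilities.

```python
# Minimal sketch of combining elicited expert probabilities with a
# linear opinion pool. All experts, weights, and probabilities are
# hypothetical illustrations.
experts = {"A": 0.02, "B": 0.10, "C": 0.05}  # elicited P(catastrophe pathway)
weights = {"A": 0.5, "B": 0.25, "C": 0.25}   # e.g., calibration-based weights

pooled = sum(weights[e] * p for e, p in experts.items())
print(f"Pooled probability: {pooled:.4f}")   # 0.5*0.02 + 0.25*0.10 + 0.25*0.05
```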

Uncertain human consequences in asteroid risk analysis and the global catastrophe threshold

Seth D. Baum. Natural Hazards, vol. 94, no. 2 (November), 2018, pages 759-775, DOI 10.1007/s11069-018-3419-4.

This paper critiques the treatment of human consequences in the literature on asteroid risk. Asteroids are an important case study because they are arguably the most well-understood global catastrophic risk. However, this paper shows that the human consequences are not well understood, especially for the more catastrophic collisions. The paper shows why the aftermath of global catastrophe is a large and important point of uncertainty in the study of global catastrophic risk.

Additional Publications

Seth D. Baum, forthcoming. Assessing the risk of takeover catastrophe from large language models. Risk Analysis, DOI 10.1111/risa.14353.

Seth D. Baum, 2024. Climate change, uncertainty, and global catastrophic risk. Futures, vol. 162 (September), article 103432, DOI 10.1016/j.futures.2024.103432.

Seth D. Baum, 2019. Risk-risk tradeoff analysis of nuclear explosives for asteroid deflection. Risk Analysis, vol. 39, no. 11 (November), pages 2427-2442, DOI 10.1111/risa.13339.

Seth D. Baum, 2019. The challenge of analyzing global catastrophic risks. Decision Analysis Today, vol. 38, no. 1 (July), pages 20-24.

Steven Umbrello and Seth D. Baum, 2018. Evaluating future nanotechnology: The net societal impacts of atomically precise manufacturing. Futures, vol. 100 (June), pages 63-73, DOI 10.1016/j.futures.2018.04.007.

Seth D. Baum, Robert de Neufville, and Anthony M. Barrett, 2018. A model for the probability of nuclear war. Global Catastrophic Risk Institute Working Paper 18-1.

Seth D. Baum and Anthony M. Barrett, 2018. A model for the impacts of nuclear war. Global Catastrophic Risk Institute Working Paper 18-2.

Anthony M. Barrett and Seth D. Baum, 2017. A model of pathways to artificial superintelligence catastrophe for risk and decision analysis. Journal of Experimental & Theoretical Artificial Intelligence, vol. 29, no. 2, pages 397-414, DOI 10.1080/0952813X.2016.1186228.

Seth D. Baum and Anthony M. Barrett, 2017. Towards an integrated assessment of global catastrophic risk. In B.J. Garrick (editor), Proceedings of the First Colloquium on Catastrophic and Existential Risk, Garrick Institute for the Risk Sciences, University of California, Los Angeles, pages 41-62.

Seth D. Baum, Anthony M. Barrett, and Roman V. Yampolskiy, 2017. Modeling and interpreting expert disagreement about artificial superintelligence. Informatica, vol. 41, no. 7 (December), pages 419-428.

Anthony M. Barrett, 2016. False alarms, true dangers? Current and future risks of inadvertent U.S.-Russian nuclear war. RAND Corporation, document PE-191-TSF, DOI 10.7249/PE191.

Seth D. Baum, 2015. Risk and resilience for unknown, unquantifiable, systemic, and unlikely/catastrophic threats. Environment Systems and Decisions, vol. 35, no. 2 (June), pages 229-236, DOI 10.1007/s10669-015-9551-8.

Seth D. Baum, 2014. The great downside dilemma for risky emerging technologies. Physica Scripta, vol. 89, no. 12 (December), article 128004, DOI 10.1088/0031-8949/89/12/128004.

Seth D. Baum and Itsuki C. Handoh, 2014. Integrating the planetary boundaries and global catastrophic risk paradigms. Ecological Economics, vol. 107 (November), pages 13-21, DOI 10.1016/j.ecolecon.2014.07.024.

Anthony M. Barrett, Seth D. Baum, and Kelly R. Hostetler, 2013. Analyzing and reducing the risks of inadvertent nuclear war between the United States and Russia. Science and Global Security, vol. 21, no. 2, pages 106-133, DOI 10.1080/08929882.2013.798984.

Jacob Haqq-Misra, Michael W. Busch, Sanjoy M. Som, and Seth D. Baum, 2013. The benefits and harm of transmitting into space. Space Policy, vol. 29, no. 1 (February), pages 40-48, DOI 10.1016/j.spacepol.2012.11.006.