GCRI studies the relative importance of different global catastrophic risks in order to evaluate tradeoffs and allocation priorities.
Many important decisions require an understanding of multiple global catastrophic risks. These decisions fit into two broad categories. First, there are “risk-risk tradeoffs” in which an action can decrease one risk while increasing another. For example, nuclear power can decrease global warming risk and increase nuclear war risk. Second, there are decisions about how to allocate scarce resources. For example, a philanthropist can donate to a nonprofit working on any of the various global catastrophic risks, but any given dollar can only be donated once. Each of these types of decisions benefits from quantitative comparison of multiple global catastrophic risks.
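As a minimal illustration of such quantitative comparison, the sketch below computes the net change in expected loss for a hypothetical risk-risk tradeoff and compares expected loss averted per dollar across two hypothetical funding options. All names and numbers are invented placeholders, not GCRI estimates:

```python
# Illustrative sketch of the two decision types described above.
# All probabilities, harms, and costs are hypothetical placeholders.

# Annual probability and harm (in arbitrary loss units) for two risks.
risks = {
    "risk_A": {"probability": 1e-3, "harm": 1e9},
    "risk_B": {"probability": 1e-4, "harm": 5e9},
}

# 1. Risk-risk tradeoff: an action lowers risk A but raises risk B.
delta_a = -2e-4  # hypothetical change in P(risk_A) from taking the action
delta_b = +2e-5  # hypothetical change in P(risk_B) from taking the action
net_change = (delta_a * risks["risk_A"]["harm"]
              + delta_b * risks["risk_B"]["harm"])
print(f"Net change in expected loss: {net_change:+.3g}")  # negative = net risk reduction

# 2. Allocation: compare expected loss averted per dollar across options.
interventions = {
    "fund_A": {"cost": 1e6, "prob_reduction": 1e-4, "risk": "risk_A"},
    "fund_B": {"cost": 1e6, "prob_reduction": 1e-5, "risk": "risk_B"},
}
for name, iv in interventions.items():
    averted = iv["prob_reduction"] * risks[iv["risk"]]["harm"]
    print(f"{name}: {averted / iv['cost']:.3g} expected loss averted per dollar")
```

In practice, the probabilities and harms would come from detailed risk models subject to deep uncertainty, which is one reason the work emphasizes rigorous risk and decision analysis rather than simple point estimates.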
GCRI is a leader in the cross-risk evaluation of global catastrophic risk. We have developed the concept of integrated assessment of global catastrophic risk as an overarching framework for our work. Our integrated assessment aims to bring all of the global catastrophic risks into a single analysis in order to perform cross-risk evaluation and inform risk-risk tradeoffs and allocation prioritization. This effort is bolstered by our ongoing work on the risk and decision analysis of global catastrophic risk and by our expertise in several of the individual risks. We have also conducted cross-risk evaluations of several specific decision topics.
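One reason a single integrated analysis matters is that the total probability of global catastrophe depends on all of the risks together. The sketch below combines hypothetical annual probabilities under a strong (and almost certainly false) independence assumption; the integrated assessment framework in the paper featured below is far richer than this toy calculation:

```python
import math

# Hypothetical annual probabilities for several global catastrophic
# risks; placeholders only, not estimates from GCRI's work.
annual_probabilities = {
    "nuclear_war": 1e-3,
    "pandemic": 5e-4,
    "asteroid_impact": 1e-6,
}

def total_catastrophe_probability(probs, years=1):
    """P(at least one catastrophe over the horizon), assuming the risks
    are statistically independent -- a strong simplifying assumption."""
    p_none_one_year = math.prod(1 - p for p in probs.values())
    return 1 - p_none_one_year ** years

print(f"1-year total:   {total_catastrophe_probability(annual_probabilities):.4%}")
print(f"100-year total: {total_catastrophe_probability(annual_probabilities, years=100):.2%}")
```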
Featured Publications
Towards an integrated assessment of global catastrophic risk
Seth D. Baum and Anthony M. Barrett. Catastrophic and Existential Risk: Proceedings of the First Colloquium, UCLA, 2017
This paper presents a concept for using risk analysis to identify the best ways of reducing the entirety of global catastrophic risk. The concept, dubbed integrated assessment, synthesizes aspects of the definition, ethics, and risk analysis of global catastrophe.
Long-term trajectories of human civilization
Seth D. Baum, Stuart Armstrong, Timoteus Ekenstedt, Olle Häggström, Robin Hanson, Karin Kuhlemann, Matthijs M. Maas, James D. Miller, Markus Salmela, Anders Sandberg, Kaj Sotala, Phil Torres, Alexey Turchin, and Roman V. Yampolskiy. Foresight, 2019, DOI 10.1108/FS-04-2018-0037
This paper evaluates the fate of human civilization millions, billions, or trillions of years into the future. One major theme of the paper is the relative importance of global catastrophes that result in human extinction vs. those that leave some human survivors.
Additional Publications
Rumtin Sepasspour, 2023. All-hazards policy for global catastrophic risk. Global Catastrophic Risk Institute Technical Report 23-1.
Victor Galaz, Miguel A. Centeno, Peter W. Callahan, Amar Causevic, Thayer Patterson, Irina Brass, Seth Baum, Darryl Farber, Joern Fischer, David Garcia, Timon McPhearson, Daniel Jimenez, Brian King, Paul Larcey, and Karen Levy, 2021. Artificial intelligence, systemic risks, and sustainability. Technology in Society, vol. 67 (November), article 101741, DOI 10.1016/j.techsoc.2021.101741.
Seth D. Baum, 2019. Risk-risk tradeoff analysis of nuclear explosives for asteroid deflection. Risk Analysis, vol. 39, no. 11 (November), pages 2427-2442, DOI 10.1111/risa.13339.
Seth D. Baum and Anthony M. Barrett, 2018. Global catastrophes: The most extreme risks. In Vicki Bier (editor), Risk in Extreme Environments: Preparing, Avoiding, Mitigating, and Managing. New York: Routledge, pages 174-184.
Steven Umbrello and Seth D. Baum, 2018. Evaluating future nanotechnology: The net societal impacts of atomically precise manufacturing. Futures, vol. 100 (June), pages 63-73, DOI 10.1016/j.futures.2018.04.007.
Seth D. Baum, 2014. The great downside dilemma for risky emerging technologies. Physica Scripta, vol. 89, no. 12 (December), article 128004, DOI 10.1088/0031-8949/89/12/128004.
Seth D. Baum, Timothy M. Maher Jr., and Jacob Haqq-Misra, 2013. Double catastrophe: Intermittent stratospheric geoengineering induced by societal collapse. Environment Systems and Decisions, vol. 33, no. 1 (March), pages 168-180, DOI 10.1007/s10669-012-9429-y.