Nonhuman Value: A Survey of the Intrinsic Valuation of Natural and Artificial Nonhuman Entities

View the paper “Nonhuman Value: A Survey of the Intrinsic Valuation of Natural and Artificial Nonhuman Entities”

The concept of global catastrophic
risk is customarily defined in human terms. Details vary, but a global
catastrophe is almost always regarded as something bad that happens to
humans. However, in moral philosophy, it is widely held that things that
happen to nonhumans can also be bad, and likewise for good things. In some
circumstances, whether and how nonhumans are valued may be the difference
between extremely good and catastrophically bad outcomes for nonhumans. This
raises the …

Space Expansion Must Support Sustainability – On Earth and in Space

View the article “Space Expansion Must Support Sustainability – On Earth and in Space”.

This article, published with the Royal United Services
Institute, discusses the role of sustainability when expanding human activities
into outer space. The article illustrates how a framework for space expansion
is being set right now, but that this framework risks expanding unsustainable practices
and paradigms into space. Consequently, global civilization risks wasting
immense amounts of resources and, at worst, failing to sustain humanity. In response,
the article suggests five points of emphasis for a robust sustainability …

June Newsletter: Call For Advisees And Collaborators

Dear friends,

GCRI has recently put out an open call for participants in our 2022 Advising and Collaboration Program. The Program helps people get more involved in work on global catastrophic risk and focus their activities in more successful directions. We welcome people at all career points, from all geographic locations, and with any interest across the many aspects of global catastrophic risk. No significant time commitment is required; participation can range from a one-time call to get advice on how to get more involved to extended …

Open Call for Advisees and Collaborators, May 2022

UPDATE: The open call for advisees and collaborators is now closed. Thank you to everyone who applied. However, anyone interested in seeking our advice and/or collaborating with us is still welcome to contact us as per the instructions below and we will include them in our next advisees and collaborators program.

GCRI is currently welcoming inquiries from people who are interested in seeking our advice and/or collaborating with us as part of our fourth annual Advising and Collaboration Program. Inquiries may cover any aspect of global catastrophic …

Book Review: The Precipice

View the paper “Book review: The Precipice”

Book review of The Precipice: Existential Risk and the Future of Humanity, by Toby Ord, Hachette Books, 2020.

The new book The Precipice by Toby Ord provides a wide-ranging survey of topics related to global catastrophic risk. Compared to other books on global catastrophic risk, The Precipice stands out for its depth of discussion, its quality of scholarship, and its readability. However, the book errs in its emphasis on only the most extreme global catastrophe scenarios, its strong belief in the resilience of civilization, and …

Pandemic Refuges: Lessons from Two Years of COVID-19

View the paper “Pandemic Refuges: Lessons from Two Years of COVID-19”

Refuges have been proposed as a means of ensuring that at least some people survive a global catastrophe. While it would be better to avoid the catastrophe in the first place, if a catastrophe is to occur, a refuge could be a real difference-maker in terms of the long-term effects on human civilization. Prior refuges research emphasizes highly isolated locations such as underground, underwater, or in outer space. These exotic concepts may seem far removed …

March Newsletter: Implications of the War in Ukraine

Dear friends,

The Russian invasion of Ukraine is already proving to be an event of profound importance for global catastrophic risk. As detailed in the GCRI Statement on the Russian Invasion of Ukraine, the war’s implications for nuclear war risk are especially strong, but it also has implications for other risks including climate change, pandemics, and artificial intelligence. These changes are coming from the war itself and from the accompanying shifts in global politics. We at GCRI hope that the war can reach a prompt and peaceful …

Early Reflections and Resources on the Russian Invasion of Ukraine

View the article “Early Reflections and Resources on the Russian Invasion of Ukraine”.

This article, published in the Effective Altruism Forum, presents analysis of the Russian invasion of Ukraine written for a global catastrophic risk audience. The article discusses nuclear war risk, the changing geopolitical landscape, and recommendations for personal preparedness and philanthropy. It also describes the author’s own activities in addressing the immediate risk and presents a compilation of resources for learning more about the war.

See also the GCRI Statement on the Russian Invasion of Ukraine.

The …

How to Evaluate the Risk of Nuclear War

View the article “How to evaluate the risk of nuclear war”.

This article, published in BBC Future, discusses the quantitative analysis of nuclear war risk. It is written in the context of the Russian invasion of Ukraine and also discusses more general analytical issues, such as found in GCRI’s nuclear war research.

See also the GCRI Statement on the Russian Invasion of Ukraine.

The article begins as follows:

One day last week, I woke up in the morning and looked out the window to see the Sun was shining. My neighbourhood …

Recommendations to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan

View GCRI’s submission to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan.

On 1 January 2021, the National AI Initiative Act of 2020 became law as part of the National Defense Authorization Act. The National AI Initiative Act calls for regular updates to the National AI R&D Strategic Plan to include “goals, priorities, and metrics for guiding and evaluating how the agencies carrying out the National AI Initiative will support AI research.” The Office of Science and Technology Policy (OSTP) requested input from the …
