How to Evaluate the Risk of Nuclear War

View the article “How to evaluate the risk of nuclear war”.

This article, published in BBC Future, discusses the quantitative analysis of nuclear war risk. It is written in the context of the Russian invasion of Ukraine and also discusses more general analytical issues, such as those found in GCRI’s nuclear war research.

See also the GCRI Statement on the Russian Invasion of Ukraine.

The article begins as follows:

One day last week, I woke up in the morning and looked out the window to see the Sun was shining. My neighbourhood …


Recommendations to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan

View GCRI’s submission to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan.

On 1 January 2021, the National AI Initiative Act of 2020 became law as part of the National Defense Authorization Act. The National AI Initiative Act calls for regular updates to the National AI R&D Strategic Plan to include “goals, priorities, and metrics for guiding and evaluating how the agencies carrying out the National AI Initiative will support AI research.” The Office of Science and Technology Policy (OSTP) requested input from the …


GCRI Statement on the Russian Invasion of Ukraine

The ongoing Russian invasion of Ukraine is already shaping up to be an event of historic proportions. This includes, but is certainly not limited to, its implications for global catastrophic risk. We at GCRI are monitoring the unfolding events with great concern. While it is always important to understand all parties’ perspectives on a conflict, in this case we find ourselves strongly condemning the actions of the Russian government. Our hearts go out to the many people in Ukraine who have been tragically affected by …


February Newsletter: Ukraine & Pluralism

Dear friends,

We at GCRI are watching the ongoing Russian invasion of Ukraine with great concern. In addition to the grave harm being inflicted on the Ukrainian people, this invasion also constitutes a large escalation of tensions between Russia and the West and a shooting war adjacent to several NATO countries. In our judgment, this increases the risk of US-Russia or NATO-Russia nuclear war and accompanying nuclear winter. Our hearts go out to the people of Ukraine who are enduring this tragic violence. For the sake …


GCRI Statement on Pluralism in the Field of Global Catastrophic Risk

Global catastrophic risk is an important societal issue area. As such, it is to be expected that there will be a variety of views on it. We at GCRI believe that it is important to consider a range of views to better understand the topic of global catastrophic risk and the constructive options for addressing the risk. We are likewise interested in supporting a pluralistic field of global catastrophic risk.

Many types of pluralism can be valuable for the field of global catastrophic risk. Examples include …


Greening the Universe: The Case for Ecocentric Space Expansion

View the paper “Greening the Universe: The Case for Ecocentric Space Expansion”.

One reason for focusing on global catastrophic risk is that a global catastrophe could prevent human civilization from accomplishing great things in the future. Arguably, some of the greatest things it could accomplish involve expansion into outer space. This paper presents an ecocentric vision for future space expansion, in which human civilization spreads flourishing ecosystems across the cosmos. The paper is part of a broader collection of visions for space exploration …


December Newsletter: Thank You & Happy New Year

Dear friends,

As this year comes to a close, we at GCRI would like to formally express our gratitude for your continued support. Support comes in many forms, and we recognize that not everyone has the ability to support us financially. However, we are lucky enough to receive a variety of other helpful forms of support, such as when someone shares our work, reads our research papers, collaborates with us on projects, introduces us to their colleagues, or just finds time to connect with us. We’ve …


GCRI Receives $200,000 for 2022 Work on AI

I am delighted to announce that GCRI has received a new $200,000 donation from Gordon Irlam to fund work on AI in 2022. Irlam had previously made donations funding AI project work conducted in 2021, 2020, and 2019.

All of us at GCRI are grateful for this donation. We are excited to continue our work addressing AI risk.

Our projects for 2022 cover the following topics:

Continuation of prior projects: We will continue work on select projects from previous years.

Further support for the AI and global catastrophic risk talent pools: This project extends …


From AI for People to AI for the World and the Universe

View the paper “From AI for People to AI for the World and the Universe”.

Work on the ethics of artificial intelligence often focuses on the value of AI to human populations. This is seen, for example, in initiatives on AI for People. These initiatives do well to identify some important AI ethics issues, but they fall short by neglecting the ethical importance of nonhumans. This short paper calls for AI ethics to better account for nonhumans, such as by giving initiatives names like “AI for …


November Newsletter: Year in Review

Dear friends,

2021 has been a year of overcoming challenges, of making the most of it under difficult circumstances. The Delta variant dashed hopes for a smooth recovery from the COVID-19 pandemic. Outbreaks surged even in places with high vaccination rates, raising questions of when, or even if, the pandemic will ever end. As we at GCRI are abundantly aware, it could be a lot worse. But it has still been bad, and we send our condolences to those who have lost loved ones. Despite the circumstances, …
