How to Evaluate the Risk of Nuclear War

View the article “How to evaluate the risk of nuclear war”.

This article, published in BBC Future, discusses the quantitative analysis of nuclear war risk. It is written in the context of the Russian invasion of Ukraine and also addresses more general analytical issues, such as those found in GCRI's nuclear war research.

See also the GCRI Statement on the Russian Invasion of Ukraine.

The article begins as follows:

One day last week, I woke up in the morning and looked out the window to see the Sun was shining. My neighbourhood …

GCRI Statement on the Russian Invasion of Ukraine

The ongoing Russian invasion of Ukraine is already shaping up to be an event of historic proportions. This includes, but is certainly not limited to, its implications for global catastrophic risk. We at GCRI are monitoring the unfolding events with great concern. While it is always important to understand all parties’ perspectives on a conflict, in this case we find ourselves strongly condemning the actions of the Russian government. Our hearts go out to the many people in Ukraine who have been tragically affected by …

February Newsletter: Ukraine & Pluralism

Dear friends,

We at GCRI are watching the ongoing Russian invasion of Ukraine with great concern. In addition to the grave harm being inflicted on the Ukrainian people, the invasion constitutes a major escalation of tensions between Russia and the West and a shooting war adjacent to several NATO countries. In our judgment, this increases the risk of US-Russia or NATO-Russia nuclear war and accompanying nuclear winter. Our hearts go out to the people of Ukraine who are enduring this tragic violence. For the sake …

GCRI Statement on Pluralism in the Field of Global Catastrophic Risk

Global catastrophic risk is an important societal issue area. As such, it is to be expected that there will be a variety of views on it. We at GCRI believe that it is important to consider a range of views to better understand global catastrophic risk and the constructive options for addressing it. We are likewise interested in supporting a pluralistic field of global catastrophic risk.

Many types of pluralism can be valuable for the field of global catastrophic risk. Examples include …

GCRI Receives $200,000 for 2022 Work on AI

I am delighted to announce that GCRI has received a new $200,000 donation from Gordon Irlam to fund work on AI in 2022. Irlam had previously made donations funding AI project work conducted in 2021, 2020, and 2019.

All of us at GCRI are grateful for this donation. We are excited to continue our work addressing AI risk.

Our projects for 2022 cover the following topics:

Continuation of prior projects: We will continue work on select projects from previous years.

Further support for the AI and global catastrophic risk talent pools: This project extends …

2021 Annual Report

2021 has been a good year for GCRI. Our productivity is up relative to previous years, boosted by a growing team and a rich network of outside collaborators. Our work over the past year is broadly consistent with the plans we outlined one year ago. We have adjusted well to the new realities of the COVID-19 pandemic, aided by the fact that GCRI was designed for remote collaboration from the start. Because of the pandemic, there is a sense in which no one has had a truly great …

Summary of the 2021 Advising and Collaboration Program

In May, GCRI put out an open call for people interested in seeking our advice or collaborating on projects with us. This was a continuation of our successful 2019 and 2020 Advising and Collaboration Programs. We anticipate conducting future iterations of the program in 2022 and beyond. The 2021 Program was made possible by generous support from Gordon Irlam and the Survival and Flourishing Fund.

The GCRI Advising and Collaboration Program is an opportunity for anyone interested in global catastrophic risk to get more involved in the field. There is practically no barrier to entry …

Collaborative Publishing with GCRI

Global catastrophic risk is a highly complex, interdisciplinary topic. It benefits from contributions from many people with a variety of backgrounds. For this reason, GCRI emphasizes collaborative publishing. We publish extensively with outside scholars at all career points, including early-career scholars who are relatively new to the field, as well as mid-career and senior scholars at other organizations who bring complementary expertise.

This post describes our approach to collaborative publishing and documents our collaborative publications. Researchers interested in publishing with GCRI should visit our get involved page. The …

The Ethics of Sustainability for Artificial Intelligence

View the paper “The Ethics of Sustainability for Artificial Intelligence”

Access the data used in the paper. 

AI technology can have significant effects on domains associated with sustainability, such as certain aspects of human society and the natural environment. Sustainability itself is widely regarded as a good thing, including in recent initiatives on AI and sustainability. There is therefore a role for ethical analysis to clarify what is meant by sustainability and the ways in which sustainability in the context of AI might or might not …

Artificial Intelligence Needs Environmental Ethics

View the paper “Artificial Intelligence Needs Environmental Ethics”

Artificial intelligence is an interdisciplinary topic. As such, it benefits from contributions from a wide range of disciplines. This short paper calls for greater contributions from the discipline of environmental ethics and presents several types of contributions that environmental ethicists can make.

First, environmental ethicists can raise the profile of the environmental dimensions of AI. For example, discussions of the ethics of autonomous vehicles have thus far focused mainly on “trolley problem” scenarios in which the vehicle must decide …
