GCRI Media Engagement on the Russian Invasion of Ukraine

This past fall, GCRI conducted a stretch of media outreach on the Russian invasion of Ukraine. The outreach focused on the risk of the invasion escalating to nuclear war and drew on GCRI’s research on nuclear war, especially the publication A model for the probability of nuclear war. This post summarizes the outreach work.

The work began with a series of posts by Seth Baum on Twitter, mainly in the form of extended threads. The posts are compiled in a thread of …

Read More »

Summary of the 2022 Advising and Collaboration Program

In May, GCRI put out an open call for people interested in seeking our advice or collaborating on projects with us. This was a continuation of our successful 2019, 2020, and 2021 Advising and Collaboration Programs. The 2022 Program was made possible by continued support from Gordon Irlam.

The GCRI Advising and Collaboration Program is an opportunity for anyone interested in global catastrophic risk to get more involved in the field. There is practically no barrier to entry in the program: the only thing people need …

Read More »

2022 GCRI Fellowship Program

GCRI is pleased to announce the 2022 Fellowship Program. The Fellowship Program highlights exceptional collaborators with whom GCRI had the opportunity to partner over the course of the year. This year, we have four Fellows.

The 2022 GCRI Fellows include students and senior professionals hailing from four countries around the world. Their work spans a diverse range of research disciplines, including nuclear war risk, misinformation, and artificial intelligence scenario mapping. Their contributions are invaluable, and we are confident that they will continue …

Read More »

GCRI Statement on the Ethics of Funding Sources

The field of global catastrophic risk has been jolted by the recent collapse of the cryptocurrency company FTX. Its philanthropic arm, the FTX Future Fund, was, for a brief stretch of time, a major funder of work on global catastrophic risk and related topics. Some projects related to global catastrophic risk were also funded directly by FTX co-founder and former CEO Sam Bankman-Fried.

We at GCRI are deeply saddened by this turn of events. Our hearts go out to all the victims who have suffered from …

Read More »

NAS Workshop Proceedings

On December 17 and 21, 2021, Executive Director Seth Baum delivered a remote talk called The challenges of addressing rare events and how to overcome them at the workshop Anticipating Rare Events of Major Significance, hosted by the US National Academies of Sciences, Engineering, and Medicine. The proceedings from the workshop are now available online; Baum’s remarks appear in Chapter 8, Active Prevention and Deterrence. A summary of his presentation is as follows:

Classic expected utility theory suggests that …

Read More »

June Newsletter: Call for Advisees and Collaborators

Dear friends,

GCRI has recently put out an open call for participants in our 2022 Advising and Collaboration Program. The Program helps people get more involved in work on global catastrophic risk and focus their activities in more successful directions. We welcome people at all career points, from all geographic locations, and with any interest across the many aspects of global catastrophic risk. No significant time commitment is required; participation can range from a one-time call to get advice on how to get more involved to extended …

Read More »

Open Call for Advisees and Collaborators, May 2022

UPDATE: The open call for advisees and collaborators is now closed. Thank you to everyone who applied. However, anyone interested in seeking our advice and/or collaborating with us is still welcome to contact us per the instructions below, and we will include them in our next Advising and Collaboration Program.

GCRI is currently welcoming inquiries from people who are interested in seeking our advice and/or collaborating with us as part of our fourth annual Advising and Collaboration Program. Inquiries may cover any aspect of global catastrophic …

Read More »

Recommendations to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan

View GCRI’s submission to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan.

On 1 January 2021, the National AI Initiative Act of 2020 became law as part of the National Defense Authorization Act. The National AI Initiative Act calls for regular updates to the National AI R&D Strategic Plan to include “goals, priorities, and metrics for guiding and evaluating how the agencies carrying out the National AI Initiative will support AI research.” The Office of Science and Technology Policy (OSTP) requested input from the …

Read More »

December Newsletter: Thank You & Happy New Year

Dear friends,

As this year comes to a close, we at GCRI would like to formally express our gratitude for your continued support. Support comes in many forms, and we recognize that not everyone has the ability to support us financially. However, we are lucky enough to receive a variety of other helpful forms of support, such as when someone shares our work, reads our research papers, collaborates with us on projects, introduces us to their colleagues, or just finds time to connect with us. We’ve …

Read More »

2021 Advising and Collaboration Program Testimonials

The GCRI Advising and Collaboration Program welcomes people from all backgrounds and career points to connect with GCRI, get advice, and collaborate on projects. The program provides personalized experiences tailored to each individual’s needs and circumstances, so every participant gets something different out of it. Below are testimonials from six participants describing their experiences in the 2021 Advising and Collaboration Program.

Uliana Certan, International relations scholar
Manon Gouiran, Research intern at the Swiss Center for Affective Sciences
Aaron Martin, PhD …

Read More »