GCRI Media Engagement On The Russian Invasion Of Ukraine

This past fall, GCRI conducted a stretch of media outreach regarding the Russian invasion of Ukraine, focused on the risk of the invasion escalating to nuclear war. The outreach drew on GCRI’s research on nuclear war, especially the publication A model for the probability of nuclear war. This post summarizes the outreach work.

The work began with a series of posts by Seth Baum on Twitter, mainly in the form of extended threads. The posts are compiled in a thread of …

Recommendations to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan

View GCRI’s submission to the OSTP on the National Artificial Intelligence Research and Development Strategic Plan.

On 1 January 2021, the National AI Initiative Act of 2020 became law as part of the National Defense Authorization Act. The National AI Initiative Act calls for regular updates to the National AI R&D Strategic Plan to include “goals, priorities, and metrics for guiding and evaluating how the agencies carrying out the National AI Initiative will support AI research.” The Office of Science and Technology Policy (OSTP) requested input from the …

Open Call for Advisees and Collaborators, May 2021

UPDATE: The open call for advisees and collaborators is now closed. Thank you to everyone who applied. However, anyone interested in seeking our advice and/or collaborating with us is still welcome to contact us as per the instructions below and we will include them in our next advisees and collaborators program.

GCRI is currently welcoming inquiries from people who are interested in seeking our advice and/or collaborating with us. Inquiries can concern any aspect of global catastrophic risk. We welcome inquiries from people at any career …

Summary of January-July 2020 Advising and Collaboration Program

In January, GCRI put out an open call for people interested in seeking our advice or collaborating on projects with us. This was a continuation of last year’s successful advising and collaboration program. We anticipate conducting a second round of the program later in 2020. The 2020 programs are made possible by generous support from Gordon Irlam.

This first 2020 program focused on a number of AI projects that are also supported by Irlam. Program participants were mostly people interested in AI risk, ethics, and policy. …

Call for Advisees and Collaborators for Select AI Projects, January 2020

UPDATE June 9: We are not taking on new participants to this advisees and collaborators program right now. However, anyone interested in seeking our advice and/or collaborating with us is still welcome to contact us as per the instructions below and we will include them in our next advisees and collaborators program.

GCRI is currently welcoming inquiries from people who are interested in seeking our advice and/or collaborating with us on select AI projects, which are detailed below. We have some funding available for select project …

My Experience With The GCRI Advising/Collaboration Program: A Junior Collaborator From Singapore

I am a recent college graduate working on AI policy with an emphasis on global catastrophic risk. This past April, I moved back to my home country, Singapore, to start as a Research Associate at the Centre for AI and Data Governance at Singapore Management University. I first got in touch with GCRI during my time at university, and reconnected over the summer as part of its advising/collaboration program.

The Centre for AI and Data Governance is fairly new. It mainly focuses on scholarship of Singapore law …

Summary of 2019 Advising and Collaboration Program

In May, GCRI put out an open call for people interested in seeking our advice and/or collaborating with us on projects. This was a new initiative for us, enabled by funding we received last year. It was also a bit of an experiment: we did not know how much interest there would be, or how constructive it would be either for the advisees/collaborators or for us.

We are quite happy with how the advising/collaboration program turned out. We received inquiries from many talented people from around the …

Open Call for Advisees and Collaborators, May 2019

GCRI is currently welcoming inquiries from people who are interested in seeking our advice and/or collaborating with us on projects.

At this time we are holding a preliminary, informal, open call for people at all career points interested in working on any aspect of global catastrophic risk. We cannot engage with everyone, but will try to talk to as many interested people as we can.

Of particular interest:

Scholars and professionals at intermediate and senior career points (Ph.D. graduates and up, or equivalent professional experience) whose work overlaps with …

March Newsletter: Policy Outreach

Dear friends,

This month, GCRI welcomes our newest team member, Jared Brown, who will serve as GCRI’s Special Advisor for Government Affairs. Until recently, he worked at the US Congressional Research Service, building practical knowledge of the workings of the US government, especially with respect to emergency management and homeland security. Now, he is applying this knowledge to global catastrophic risk. His work supports the broader global catastrophic risk community’s policy outreach efforts, especially with the US government. We at GCRI are grateful for Mr. Brown’s …

Updated Donate Page: 3 Reasons To Support GCRI

To help you decide whether to donate to GCRI, we have updated our donate page with three reasons why GCRI is an excellent organization to support:

1. Global catastrophic risk is an important cause. Global catastrophes can affect everyone around the world. Preventing global catastrophes is an efficient way to help a lot of people. Humanity today faces some worrisome threats, making global catastrophic risk an urgent issue to address.

2. GCRI does important work on global catastrophic risk. We help society figure out how it can …
