March Newsletter: Policy Outreach

Dear friends,

This month, GCRI welcomes our newest team member, Jared Brown, who will serve as GCRI’s Special Advisor for Government Affairs. Until recently, he worked at the US Congressional Research Service, building practical knowledge of the workings of the US government, especially with respect to emergency management and homeland security. Now, he is applying this knowledge to global catastrophic risk. His work supports the broader global catastrophic risk community’s policy outreach efforts, especially with the US government. We at GCRI are grateful for Mr. Brown’s …

Read More »

February Newsletter: Nuclear War Risk Analysis

Dear friends,

In order to most effectively reduce the risk of global catastrophe, it is often essential to have a quantitative understanding of the risk. It is particularly essential when we are faced with decisions that involve tradeoffs between different risks and decisions that require prioritizing among multiple risks. For this reason, GCRI has long been at the forefront of the risk and decision analysis of global catastrophic risk. This month, we announce a new paper, “Reflections on the Risk Analysis of Nuclear War”. This paper summarizes the …

Read More »

December Newsletter: A Turning Point For GCRI

Dear friends,

We believe that GCRI may now be at a turning point. Having established ourselves as leaders in the field of global catastrophic risk, we now seek to scale up the organization so that we can do correspondingly more to address global catastrophic risk. To that end, we have published detailed records of our accomplishments, plans for future work, and financial needs. An overview of this information is contained in our new blog post, Summary of GCRI’s 2018-2019 Accomplishments, Plans, and Fundraising.

To begin scaling up, we …

Read More »

November Newsletter: Organization Updates

Dear friends,

We are currently in the process of implementing some major organization updates. These updates bring the organization more in line with our ongoing work and future plans.

First, we’ve published new pages for the seven topics that our work focuses on:

Aftermath of Global Catastrophe
Artificial Intelligence
Cross-Risk Evaluation & Prioritization
Nanotechnology
Nuclear War
Risk & Decision Analysis
Solutions & Strategy

Second, we’ve overhauled our affiliates, removing Associates and Junior Associates and creating a new Senior Advisory Council. We are delighted to welcome Gary Ackerman, John Garrick, and Seán Ó hÉigeartaigh as GCRI’s …

Read More »

October Newsletter: The Superintelligence Debate

Dear friends,

When I look at debates about risks from artificial intelligence, I see a lot of parallels with debates over global warming. Both involve global catastrophic risks that are, to a large extent, driven by highly profitable industries. Indeed, today most of the largest corporations in the world are in either the computer technology or fossil fuel industries.

One key difference is that whereas global warming debates have been studied in great detail by many talented researchers, AI debates have barely been studied at all. As …

Read More »

August Newsletter: Long-Term Trajectories

Dear friends,

This month I am proud to announce a new paper, “Long-Term Trajectories of Human Civilization”. The paper calls for attention to the fate of human civilization over time scales of millions, billions, or trillions of years into the future. While most attention goes to nearer-term phenomena, the long-term can be profoundly important to present-day decision-making. For example, one major issue the paper examines is the fate of global catastrophe survivors. How well they fare is a central factor in whether people today should focus …

Read More »

June Newsletter: Summer Talks

Artificial Intelligence

GCRI Associate Roman Yampolskiy gave a talk on AI safety at the Global Challenges Summit 2018 in Astana, Kazakhstan, May 17-19.

GCRI Executive Director Seth Baum and GCRI Associate Roman Yampolskiy participated in a workshop on “AI Coordination & Great Powers” hosted by Foresight Institute in San Francisco on June 7.

GCRI Executive Director Seth Baum gave a seminar on “AI Risk, Ethics, Social Science, and Policy” hosted by the University of California, Berkeley Center for Human-Compatible Artificial Intelligence (CHAI) on June 11.

Effective Altruism

GCRI Executive Director Seth Baum …

Read More »

May Newsletter: Molecular Nanotechnology

Dear friends,

It has been a productive month for GCRI, with new papers by several of our affiliates. Here, I would like to highlight one by Steven Umbrello and myself, on the topic of molecular nanotechnology, also known as atomically precise manufacturing (APM).

At present, APM exists only in a crude form, such as the work recognized by the 2016 Nobel Prize in Chemistry. However, it may be able to revolutionize manufacturing, making it inexpensive and easy to produce a wide range of goods, resulting in what …

Read More »

April Newsletter: Nuclear War Impacts

Dear friends,

This month we are announcing a new paper, “A Model for the Impacts of Nuclear War”, co-authored by Tony Barrett and myself. The paper presents a detailed and comprehensive accounting of the many ways that nuclear war can harm human beings and human society. It is a counterpart to the paper we announced last month, “A Model for the Probability of Nuclear War”.

The model has five main branches corresponding to the five main types of impacts of nuclear weapon detonations: thermal radiation, blast, ionizing …

Read More »

March Newsletter: Nuclear War Probability

Dear friends,

This month we are announcing a new paper, “A Model for the Probability of Nuclear War”, co-authored by Robert de Neufville, Tony Barrett, and myself. The paper presents the most detailed accounting of the probability of nuclear war yet available.

The core of the paper is a model covering 14 scenarios for how nuclear war could occur. In six scenarios, a state intentionally starts nuclear war. In the other eight, a state mistakenly believes it is under nuclear attack by another state and starts nuclear …

Read More »