Risk Analysis and Risk Management for the Artificial Superintelligence Research and Development Process

View the paper “Risk Analysis and Risk Management for the Artificial Superintelligence Research and Development Process”

Computers can already outsmart humans in specific domains, such as multiplication. But humans remain firmly in control… for now. Artificial superintelligence (ASI) is AI with intelligence that vastly exceeds humanity’s across a broad range of domains. Experts increasingly believe that ASI could be built sometime in the future, could take control of the planet away from humans, and could cause a global catastrophe. Alternatively, if ASI is built safely, it may …


Winter-Safe Deterrence as a Practical Contribution to Reducing Nuclear Winter Risk: A Reply

View the paper “Winter-Safe Deterrence as a Practical Contribution to Reducing Nuclear Winter Risk: A Reply”

In a recent issue of this journal, I published an article proposing the concept of winter-safe deterrence. The article defined winter-safe deterrence as “military force capable of meeting the deterrence goals of today’s nuclear weapon states without risking catastrophic nuclear winter”. The article and a summary version published in the Bulletin of the Atomic Scientists have since stimulated extensive discussion in social media, the Bulletin, and now a symposium in this journal. The discussion has been productive for refining certain …


FLI Artificial Superintelligence Project

I am writing to announce that GCRI has received a grant from the Future of Life Institute, with funding provided by Elon Musk and the Open Philanthropy Project. The official announcement is here and the full list of awardees is here.

GCRI’s project team includes Tony Barrett, Roman Yampolskiy, and myself. Here is the project title and summary:

Evaluation of Safe Development Pathways for Artificial Superintelligence

Some experts believe that computers could eventually become much smarter than humans. They call this artificial superintelligence, or ASI. If …


June Newsletter: The Winter-Safe Deterrence Controversy

Dear friends,

The last few months have gone well for GCRI. We have several new papers out, two new student affiliates, and some projects in the works that I hope to announce in an upcoming newsletter. Meanwhile, I’d like to tell you about a little controversy we recently found ourselves in.

The controversy surrounds a new research paper of mine titled “Winter-Safe Deterrence: The Risk of Nuclear Winter and Its Challenge to Deterrence”. The essence of winter-safe deterrence is to seek options for deterrence that would …


Resilience to Global Food Supply Catastrophes

View the paper “Resilience to Global Food Supply Catastrophes”

A global catastrophic risk is a risk of an event that would cause major harm to global human civilization. Many global catastrophic risks threaten major disruption to global food supplies, including nuclear war, volcanic eruptions, asteroid and comet impacts, abrupt climate change, and plant disease outbreaks. Global food supply catastrophes are thus an important class of global catastrophic risk. This paper studies how to make humanity …


Risk and Resilience For Unknown, Unquantifiable, Systemic, and Unlikely/Catastrophic Threats

View the paper “Risk and Resilience For Unknown, Unquantifiable, Systemic, and Unlikely/Catastrophic Threats”

Risk and resilience are important paradigms for guiding decisions made under uncertainty, in particular decisions about how to protect systems from threats. The risk paradigm tends to emphasize reducing the probabilities and magnitudes of potential losses. The resilience paradigm tends to emphasize increasing the ability of systems to retain critical functionality by absorbing the disturbance, adapting to it, or recovering from it. This paper discusses the suitability of each paradigm for threats that …


Winter-Safe Deterrence: The Risk of Nuclear Winter and Its Challenge to Deterrence

View the paper “Winter-Safe Deterrence: The Risk of Nuclear Winter and Its Challenge to Deterrence”

Eight countries have large nuclear arsenals: China, France, India, Israel, Pakistan, Russia, the United Kingdom, and the United States. North Korea might have a small nuclear arsenal. These countries have nuclear weapons for several reasons. Perhaps the biggest reason is deterrence. Nuclear deterrence means threatening other countries with nuclear weapons in order to persuade them not to attack. When nuclear deterrence works, it can help avoid nuclear war. However, nuclear deterrence …


February Newsletter: New Directions For GCRI

Dear friends,

I am delighted to announce important changes in GCRI’s identity and direction. GCRI is now just over three years old. In these years we have learned a lot about how we can best contribute to the issue of global catastrophic risk. Initially, GCRI aimed to lead a large global catastrophic risk community while also performing original research. This aim is captured in GCRI’s original mission statement, to help mobilize the world’s intellectual and professional resources to meet humanity’s gravest threats.

Our community building has been …


Global Catastrophes: The Most Extreme Risks

View the paper “Global Catastrophes: The Most Extreme Risks”

The most extreme risks are those that threaten the entirety of human civilization, known as global catastrophic risks. The very extreme nature of global catastrophes makes them both challenging to analyze and important to address. They are challenging to analyze because they are largely unprecedented and because they involve the entire global human system. They are important to address because they threaten everyone around the world and future generations. Global catastrophic risks also pose some deep dilemmas. …


January Newsletter: Vienna Conference on Nuclear Weapons

Dear friends,

In December, I had the honor of speaking at the Vienna Conference on the Humanitarian Impact of Nuclear Weapons, hosted by the Austrian Foreign Ministry in the lavish Hofburg Palace. The audience was 1,000 people representing 158 national governments plus leading nuclear weapons NGOs, experts, and members of the media.

My talk “What is the risk of nuclear war?” presented core themes from the risk analysis of nuclear war. I explained that each of us is, on average, more likely to die from nuclear war …
