September Newsletter: AI, Nuclear War, and News Projects

Dear friends,

I’m delighted to announce three new funded projects. Two are for risk modeling, one on artificial intelligence and one on nuclear war. These follow directly from our established nuclear war and emerging technologies research projects. The third is for covering current events across the breadth of global catastrophic risk topics, following directly from our news summaries. It is an honor to be recognized for our work and to have the opportunity to expand it. Please stay tuned as these projects unfold.

As always, thank you for your interest in our work. We welcome any comments, questions, and criticisms you may have.

Sincerely,
Seth Baum, Executive Director

GCR News Summaries

Here are Robert de Neufville’s monthly news summaries for June, July, and August. As always, these summarize recent events across the breadth of GCR topics.

Society for Risk Analysis 2015 Annual Meeting

The Society for Risk Analysis is the leading academic and professional society for all aspects of risk. GCRI hosts sessions on global catastrophic risk at its annual meeting each year, bringing together leading experts in the field. This year, GCRI is hosting two sessions: one on armed conflict (covering both local- and global-scale conflict) and one on global catastrophic risks in general. The conference is 6-10 December in Arlington, VA, and GCRI’s sessions are on the 7th and 8th. Session details are available here.

New Grant: Artificial Intelligence

GCRI has received a grant for research on artificial intelligence risk through a grant competition hosted by the Future of Life Institute, with funding from Elon Musk and the Open Philanthropy Project. The project team includes Tony Barrett, Roman Yampolskiy, and Seth Baum. The project title is “Evaluation of Safe Development Pathways for Artificial Superintelligence”. Details are available here.

New Grant: Nuclear War

GCRI has received a grant on nuclear war risk from the Global Challenges Foundation. The grant is for modeling the probability and severity of a range of nuclear war scenarios. Details are available here.

New Grant: GCR Current Events

GCRI has received a grant from the Global Challenges Foundation to cover GCR current events. Under this grant, GCRI will track current events across the breadth of global catastrophic risk topics. Details are available here.

New Paper: AI Risk

The first paper from GCRI’s new line of research on artificial intelligence risk has been published. The paper, Risk analysis and risk management for the artificial superintelligence research and development process, is authored by Tony Barrett and Seth Baum.

New Science Article: Biological Weapons

GCRI Associate Gary Ackerman, together with colleagues Crystal Boddie, Matthew Watson, and Gigi Kwik Gronvall, published the paper Assessing the bioweapons threat in the journal Science. The paper presents a Delphi survey of 62 leading experts on the likelihood of a large-scale biological attack within the next 10 years and the likelihood of actionable intelligence about such an attack.

Symposium: Winter-Safe Deterrence

The journal Contemporary Security Policy has published a symposium on Seth Baum’s paper Winter-safe deterrence: The risk of nuclear winter and its challenge to deterrence. The symposium features contributions from international security experts Aaron Karp & Regina Karp, Christian Enemark, Jean Pascal Zanders, and Patricia Lewis, as well as a reply by Baum. Symposium details are available here.

New Popular Articles

Seth Baum and Trevor White have published an article, When robots kill, in The Guardian’s Political Science blog. It discusses AI risks ranging from driverless cars to superintelligence.

Seth Baum has two new articles in the Bulletin of the Atomic Scientists:

A picture’s power to prevent, on the significance of the 70th anniversary of the Hiroshima and Nagasaki bombings.

Breaking down the risk of nuclear deterrence failure, on how the risk of major war with versus without nuclear weapons bears on the decision of whether to pursue rapid nuclear disarmament.

Sneak Preview: Futures Special Issue

Confronting Future Catastrophic Threats to Humanity will be a special issue of the journal Futures co-edited by Seth Baum of GCRI and Bruce Tonn of the University of Tennessee. The issue is now in production, and a ‘sneak preview’ is online here with preprints of most articles.

This post was written by Seth Baum, Executive Director of the Global Catastrophic Risk Institute.