The recent US election offers a vivid reminder of how large and seemingly unlikely events can and do sometimes occur. Just as we cannot assume that elections will continue to be won by normal politicians, we also cannot assume that humanity will continue to avoid global catastrophe.
The outcome of this election has many implications for global catastrophic risk, which I outline in a new article in the Bulletin of the Atomic Scientists. To my eyes, the election increases the importance of nuclear weapons risk relative to other risks. It also draws attention to two major political issues: the possible decline of democracy in the US and other Western countries, and the possible loss of the post-WWII international order. Political science should play a greater role in the study of global catastrophic risk.
GCRI will continue to monitor these dynamics closely. Indeed, we are well set up for it, given our strong backgrounds in policy, political science, and other social sciences, as well as in the full range of global catastrophic risks. Above all, we seek to clarify what these political trends and events mean for global catastrophic risk and for the opportunities that each of us has to reduce the risk.
This holiday season, please consider donating to GCRI to support our work to study and reduce global catastrophic risk. Your tax-deductible contribution helps keep human civilization intact.
Seth Baum, Executive Director
Jacob Haqq-Misra reviewed Olle Häggström’s Here Be Dragons: Science, Technology and the Future of Humanity for Law, Innovation and Technology.
David Denkenberger was interviewed in Davos, Switzerland for German public broadcasting station Deutschlandfunk (audio in German).
Seth Baum published an article, “What Trump means for global catastrophic risk,” in the Bulletin of the Atomic Scientists.
GCRI Director of Research Tony Barrett will host and speak at a symposium on “Current and Future Global Catastrophic Risks” on December 14 as part of the Society for Risk Analysis (SRA) Annual Meeting. SRA is the premier academic and professional society for risk analysis. GCRI has led symposiums at SRA since 2010. The 2016 GCRI symposium features five talks focused on risks from AI and nuclear weapons.
GCRI Associate Roman Yampolskiy gave two talks in Lisbon, Portugal: a talk on “The Dividing Line Between Humans and Machines” at Web Summit on November 8, 2016 and a talk on “Risks of Artificial Superintelligence” at the Champalimaud Centre for the Unknown on November 9, 2016.
Roman Yampolskiy and Seth Baum both presented at the Envision Conference at Princeton University, an event for undergraduate students and early-career professionals in technology fields.