Risk-Risk Tradeoff Analysis of Nuclear Explosives for Asteroid Deflection

View the paper “Risk-Risk Tradeoff Analysis of Nuclear Explosives for Asteroid Deflection”

If an asteroid is found to be on a collision course with Earth, it may be possible to deflect it away. One way of deflecting asteroids would be to use nuclear explosives. A nuclear deflection program may reduce the risk of an asteroid collision, but it might also inadvertently increase the risk of nuclear war or other violent conflict. This paper analyzes this potential tradeoff and evaluates its policy implications. The paper is published in …

Read More »

Why Catastrophes Can Change the Course of Humanity

BBC Future just published an essay by GCRI Executive Director Seth Baum titled “Why Catastrophes Can Change the Course of Humanity” on global catastrophes and the long-term fate of human civilization. The essay argues that we need to consider the impact of potential catastrophes not just on people alive today but also on the long-term future of humanity.

The essay draws on a paper Dr. Baum wrote with a group of scholars including Olle Häggström, Robin Hanson, Karin Kuhlemann, Anders Sandberg, and Roman …

Read More »

Reflections on the Risk Analysis of Nuclear War

View the paper “Reflections on the Risk Analysis of Nuclear War”

Would the world be safer with or without nuclear weapons? On one hand, nuclear weapons may increase the severity of war due to their extreme explosive power. On the other hand, they may decrease the frequency of major wars by strengthening deterrence. Is the decrease in frequency enough to offset the increase in severity? This is a vital policy question for which risk analysis has …
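To see why this is a quantitative question, consider a minimal expected-harm framing (an illustrative sketch, not taken from the paper; the symbols $f$ and $s$ are introduced here for illustration): if major wars occur with frequency $f$ and average severity $s$, the expected harm per unit time is $f \cdot s$. In this framing, nuclear weapons reduce overall risk only if

$$f_{\text{nuclear}} \, s_{\text{nuclear}} < f_{\text{conventional}} \, s_{\text{conventional}},$$

that is, only if deterrence cuts the frequency of war by a larger factor than nuclear explosive power multiplies its severity.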

Read More »

Resilience to Global Catastrophe

View the paper “Resilience to Global Catastrophe”

One of the most important questions in the study of global catastrophic risk is how resilient global human civilization is to catastrophes. At stake here is what range of events could cause global catastrophe, and likewise how wide the scope of work on global catastrophic risk should be. A resilient civilization would only fall to a narrow range of catastrophes, and our focus could be correspondingly narrow. This short paper summarizes the state of knowledge on resilience to global …

Read More »

Countering Superintelligence Misinformation

View the paper “Countering Superintelligence Misinformation”

In any public issue, having the right information can help us make the right decisions. This holds in particular for high-stakes issues like global catastrophic risks. Unfortunately, sometimes incorrect information, or misinformation, is spread. When this happens, it is important to set the record straight.

This paper studies misinformation about artificial superintelligence, which is AI that is much smarter than humans. Current AI is not superintelligent, but if superintelligence is built, it could have massive consequences. Misinformation about superintelligence could …

Read More »

Superintelligence Skepticism as a Political Tool

View the paper “Superintelligence Skepticism as a Political Tool”

For decades, there have been efforts to exploit uncertainty about science and technology for political purposes. This practice traces to the tobacco industry’s effort to sow doubt about the link between tobacco and cancer, and it can be seen today in skepticism about climate change and other major risks. This paper analyzes the possibility that the same could happen for the potential future artificial intelligence technology known as superintelligence.

Artificial superintelligence is AI that is much smarter than …

Read More »

Uncertain Human Consequences in Asteroid Risk Analysis and the Global Catastrophe Threshold

View the paper “Uncertain Human Consequences in Asteroid Risk Analysis and the Global Catastrophe Threshold”

Asteroid collision is probably the best understood of the global catastrophic risks. This paper shows that it’s not so well understood after all, due to uncertainty in the human consequences. This finding matters both for asteroid risk and for the wider study of global catastrophic risk. If asteroid risk is not well understood, then neither are other risks such as nuclear war and pandemics.

In addition to our understanding of the risks, two other …

Read More »

Long-Term Trajectories of Human Civilization

View the paper “Long-Term Trajectories of Human Civilization”

Society today needs to pay greater attention to the long-term fate of human civilization. Important present-day decisions can affect what happens millions, billions, or trillions of years into the future. The long-term effects may be the most important factor for present-day decisions and must be taken into account. An international group of 14 scholars calls for the dedicated study of “long-term trajectories of human civilization” in order to understand long-term outcomes and inform decision-making. This new approach is presented in …

Read More »

Bulletin of the Atomic Scientists: Trump and Global Catastrophic Risk

GCRI Executive Director Seth Baum has a new article in the Bulletin of the Atomic Scientists on what Donald Trump’s election means for global catastrophic risk; the article has been covered in Quartz and Elite Daily. Baum writes that Trump’s authority to launch nuclear weapons should particularly concern us, given his tendency to behave erratically. Trump’s election also has implications for the prospect of conflict with Russia and China, the stability of the world order, the survival of democracy in the US, and our …

Read More »

Scientific American: Should We Let Uploaded Brains Take Over?

GCRI Executive Director Seth Baum has a guest blog post in Scientific American on whether we should “upload” our brains to electronic computers. He argues that while there might be substantial benefits to uploading our brains this way, the technology would also create new risks. While it may be decades or even centuries before we have the technical ability to emulate human brains, we should begin to consider those risks now.

Read More »