August Newsletter: Long-Term Trajectories

Dear friends,

This month I am proud to announce a new paper, “Long-Term Trajectories of Human Civilization.” The paper calls for attention to the fate of human civilization over time scales of millions, billions, or trillions of years into the future. While most attention goes to nearer-term phenomena, the long term can be profoundly important to present-day decision-making. For example, one major issue the paper examines is the fate of global catastrophe survivors. How well they would fare is a central factor in whether people today should focus on risks of human extinction, risks of sub-extinction global catastrophes, or other issues.

The paper was a large group effort. I am the lead author; the paper has 14 co-authors in total, including GCRI associates Matthijs Maas and Roman Yampolskiy. The paper grew out of a session I led at last year’s Workshop on Existential Risk to Humanity at Chalmers University of Technology in Gothenburg, Sweden, organized by Olle Häggström. I’d like to thank all of the co-authors and the other workshop participants for making this a much better paper than I could have written on my own.

For more information, please see the announcement on the GCRI blog and read the paper itself.

Sincerely,
Seth Baum, Executive Director

General Risk

GCRI Executive Director Seth Baum was the lead author of a paper, “Long-Term Trajectories of Human Civilization,” forthcoming in Foresight, written with an international group of 13 other scholars including Stuart Armstrong, Olle Häggström, Robin Hanson, Karin Kuhlemann, Anders Sandberg, and GCRI associates Roman Yampolskiy and Matthijs Maas. They identify four types of potential long-term trajectories: status quo trajectories, in which civilization stays about the same; catastrophe trajectories, in which civilization collapses; technological transformation trajectories, in which radical technology fundamentally changes civilization; and astronomical trajectories, in which civilization expands beyond Earth.

GCRI Executive Director Seth Baum gave a talk, “An Evening with the Global Catastrophic Risk Institute,” for the Effective Altruism NYC group on August 9.

Artificial Intelligence

GCRI Executive Director Seth Baum is giving a talk titled “Introduction to Artificial Intelligence Research” at Tech2025 on August 14 in New York City.

GCRI Associate Roman Yampolskiy gave the keynote address at the Techno Security & Digital Forensics conference in Myrtle Beach, SC on June 4. Yampolskiy was also interviewed about “AI Safety, Possible Minds, and Simulated Worlds” on the Future of Life podcast and about “Artificial Intelligence, Risk, and Alignment” on Economics Detective Radio.

GCRI Associate Dave Denkenberger co-authored a paper with Alexey Turchin titled “Classification of Global Solutions for the AI Safety Problem” that won a top prize in GoodAI’s General AI Challenge.

Asteroid Risk

GCRI Executive Director Seth Baum’s paper on “Uncertain Human Consequences in Asteroid Risk Analysis and the Global Catastrophe Threshold” is forthcoming in Natural Hazards.

This post was written by Robert de Neufville, Director of Communications of the Global Catastrophic Risk Institute.