GCRI Affiliates Overhaul

GCRI has made several major changes to our roster of affiliates, as reflected on our People page. These changes make our listing of affiliates more consistent with how GCRI is actually operating at this time and prepare us for future directions we hope to pursue.

First, the GCRI leadership team now consists only of Tony Barrett (Director of Research), Robert de Neufville (Director of Communications), and myself (Executive Director). Grant Wilson (Deputy Director) has been removed. Grant has made excellent contributions since the early days of …

Read More »

New Topics Pages

GCRI is in the process of making several general organizational updates. The first is a new collection of topics pages published on our website. They cover the major topics that GCRI currently works on. These replace our previous collection of projects pages, which had fallen out of date. The new topics pages briefly summarize each topic and GCRI’s work on it, and then list GCRI’s publications on the topic.

The topics are as follows:

Aftermath of Global Catastrophe. How well would survivors of global catastrophe fare? This …

Read More »

Superintelligence Skepticism as a Political Tool

View the paper “Superintelligence Skepticism as a Political Tool”

For decades, there have been efforts to exploit uncertainty about science and technology for political purposes. This practice traces back to the tobacco industry’s effort to sow doubt about the link between tobacco and cancer, and it can be seen today in skepticism about climate change and other major risks. This paper analyzes the possibility that the same could happen for the potential future artificial intelligence technology known as superintelligence.

Artificial superintelligence is AI that is much smarter than …

Read More »

Uncertain Human Consequences in Asteroid Risk Analysis and the Global Catastrophe Threshold

View the paper “Uncertain Human Consequences in Asteroid Risk Analysis and the Global Catastrophe Threshold”

Asteroid collision is probably the best-understood global catastrophic risk. This paper shows that it is not so well understood after all, due to uncertainty in the human consequences. This finding matters both for asteroid risk and for the wider study of global catastrophic risk: if asteroid risk is not well understood, then neither are other risks such as nuclear war and pandemics.

In addition to our understanding of the risks, two other …

Read More »

Long-Term Trajectories of Human Civilization

View the paper “Long-Term Trajectories of Human Civilization”

Society today needs to pay greater attention to the long-term fate of human civilization. Important present-day decisions can affect what happens millions, billions, or trillions of years into the future. The long-term effects may be the most important factor in present-day decisions and must be taken into account. An international group of 14 scholars calls for the dedicated study of “long-term trajectories of human civilization” in order to understand long-term outcomes and inform decision-making. This new approach is presented in …

Read More »

GCRI Symposium on Global Catastrophic Risk at SRA 2016

GCRI will lead a symposium on global catastrophic risk at the 2016 meeting of the Society for Risk Analysis (SRA), December 11-15 in San Diego. SRA is the premier academic and professional society for risk analysis. GCRI has led symposiums at SRA meetings since 2010. The 2016 GCRI symposium features five talks focused on risks from artificial intelligence and nuclear weapons.

Read More »

Seth Baum Appointed to Advisory Board of AI & Society

GCRI’s Seth Baum has been appointed to the Advisory Board of the journal AI & Society. AI & Society is an interdisciplinary journal focused on societal issues related to artificial intelligence, including the design, use, management, and policy of information, communications, and new media technologies, with a particular emphasis on cultural, social, cognitive, economic, ethical, and philosophical implications.

Read More »

New Global Challenges Foundation Projects

GCRI has two new funded projects with the Global Challenges Foundation, a philanthropic foundation based in Stockholm.

The first project is a quarterly report on developments in the world of global catastrophic risk. The reports will be an expanded version of our monthly news summaries, with some new features and an emphasis on work going on around the world to reduce the risks.

The second project is a risk analysis of nuclear war. Prior GCRI nuclear war research modeled the probability of specific nuclear war …

Read More »

FLI Artificial Superintelligence Project

I am writing to announce that GCRI has received a grant from the Future of Life Institute, with funding provided by Elon Musk and the Open Philanthropy Project. The official announcement is here and the full list of awardees is here.

GCRI’s project team includes Tony Barrett, Roman Yampolskiy, and myself. Here is the project title and summary:

Evaluation of Safe Development Pathways for Artificial Superintelligence

Some experts believe that computers could eventually become a lot smarter than humans are. They call it artificial superintelligence, or ASI. If …

Read More »

GCRI Is Joining Social & Environmental Entrepreneurs

We are pleased to announce that GCRI is joining a new fiscal sponsor, Social and Environmental Entrepreneurs (SEE). We are enthusiastic about SEE and look forward to partnering with them in this new chapter of GCRI’s existence. This also means that we are legally separating from our previous fiscal sponsor, Blue Marble Space.

What is a fiscal sponsor?

A fiscal sponsor is an umbrella organization for US nonprofits. A nonprofit project may seek fiscal sponsorship when it is too small to justify becoming a stand-alone organization, and/or …

Read More »