As part of its research, GCRI is working on several research papers for peer-reviewed publication. The first of these papers has now been accepted, and so we are adding a publications page to our website. We will be adding more publications to this page as they become available.
The first paper, by Grant Wilson, is titled “Minimizing global catastrophic and existential risks from emerging technologies through international law”. The paper has been accepted at the Virginia Environmental Law Journal and can be downloaded from the Social Science Research Network. Grant has written two non-technical summaries of the paper:
Short Summary
“Emerging technologies” like bioengineering, nanotechnology, and artificial intelligence are rapidly developing. While emerging technologies could remedy some of the world’s largest problems — climate change, diseases, hunger, and so forth — they also pose a small risk of causing global catastrophes or even wiping out the entire human population. Despite these high stakes, existing treaties do not sufficiently regulate emerging technologies, so this paper recommends a potential treaty that would provide adequate safeguards.
Long Summary
Mankind is rapidly developing “emerging technologies” in the fields of bioengineering, nanotechnology, and artificial intelligence that have the potential to solve humanity’s biggest problems, such as by curing all disease, extending human life, or mitigating massive environmental problems like climate change. However, if these emerging technologies are misused or have an unintended negative effect, the consequences could be enormous, potentially resulting in serious, global damage to humans (known as “global catastrophic harm”) or severe, permanent damage to the Earth—including, possibly, human extinction (known as “existential harm”).

The chances of a global catastrophic risk or existential risk actually materializing are relatively low, but mankind should be careful when a losing gamble means massive human death and irreversible harm to our planet. While international law has become an important source of global regulation for other global risks like climate change and biodiversity loss, emerging technologies do not fall neatly within existing international regimes, and thus any country is more or less free to develop these potentially dangerous technologies without practical safeguards that would curtail the risk of a catastrophic event.

In light of these problems, this paper discusses the risks associated with bioengineering, nanotechnology, and artificial intelligence; reviews the potential of existing international law to regulate these emerging technologies; and proposes an international regulatory regime that would put the international community in charge of ensuring that low-probability, high-consequence disasters never materialize.
The paper characterizes the Singularity Institute this way:
> the Singularity Institute for AI (“Singularity Institute”), established in part by former Pay Pal CEO Peter Thiel, teaches graduate students and executives about AI and engages in AI research and development
Singularity Institute was not “established” by Peter Thiel at all; it does not teach students or executives about AI, and it does not engage in AI “development.”
Luke, thanks for catching this. I’m embarrassed I missed it when I read the paper. Anyway, I think it’s not too late to get this corrected. We’ll see what we can do.
All versions have been corrected. Thanks so much, Luke. This is a prepublication version that will undergo fact-checking before publication, so the error would have been caught, but I am glad to have found it now. To everyone else: I love suggestions and comments, so email me anytime at grant@gcrinstitute.org.
What a thought-provoking paper. I’m interested to see how it impacts the international debate on GCRs from emerging tech. Thanks for posting this.
Thanks Sarah. Now that the paper is out, connecting with the international debates (and initiating some debates of our own) is a next step for us. Drop us an email if you have suggestions on this or would like to get involved.
Sarin is a chemical weapon, not a bioweapon…
Thanks Stuart. Nice catch.