GCR News Summary February 2016


Stop Trident demonstration in London. Image courtesy of David Holt under a Creative Commons license.

Tens of thousands of people gathered in London’s Trafalgar Square to protest the renewal of Britain’s Trident nuclear submarine program. It was the largest anti-nuclear demonstration in England since 1983, when several hundred thousand people demonstrated against the deployment of cruise missiles at Greenham Common. Labour leader Jeremy Corbyn told the protesters they should not forget that a nuclear war would mean “absolute destruction on both sides” and said that he wanted a “Labour government that would adhere to all the articles of the non-proliferation treaty”. In a blog post defending the decision to renew the program, British defense minister Philip Dunne said that disarming “would be a reckless gamble with our national security that would play into the hands of our enemies”.

The UN open-ended working group on nuclear disarmament (OEWG) met for five days in Geneva. The OEWG, which consists of both states and non-governmental organizations, was originally established in December 2012 by the UN General Assembly “to develop proposals to take forward multilateral nuclear disarmament negotiations for the achievement and maintenance of a world without nuclear weapons”. One of the main topics discussed was the possibility of a nuclear ban treaty (NBT) that would prohibit the production, deployment, and use of nuclear weapons on humanitarian grounds. No nuclear-armed states participated in the meeting. The OEWG will meet again in early May.

President Obama proposed in his 2017 budget to cut $132 million in spending on non-proliferation programs while increasing spending on nuclear weapons programs. The cuts would include a $90 million cut to the Global Material Security Program, which works to keep nuclear and radiological material secure around the world. The US also conducted two intercontinental ballistic missile (ICBM) tests in February. Deputy Defense Secretary Robert Work told reporters that the tests were intended to send a signal “that we are prepared to use nuclear weapons in defense of our country if necessary”.

North Korea successfully put a satellite into orbit in a test of its Unha-3 space launch vehicle. The Unha-3 could probably be used as an ICBM. 38 North estimated that the Unha-3 could carry a payload of nearly a metric ton around 10,000 km (around 6,200 miles). With that range a North Korean missile could hit Hawaii, Alaska, or even the west coast of the US. North Korea recently tested a nuclear weapon. According to South Korean intelligence, North Korea may be preparing another nuclear weapon test. But North Korea has not tested a reentry vehicle that would be capable of carrying a nuclear payload back into the atmosphere. Nor does the Unha-3 appear accurate enough to hit a precise target. Additionally, because it takes days of preparation to launch an Unha-3, it is probably a long way from being useful as a weapon.

Gregory Kulacki wrote in a Union of Concerned Scientists report that China may be considering putting its nuclear forces on high alert so that it can retaliate immediately if it appears to be under nuclear attack. China has in the past stressed ensuring that its nuclear forces would survive a first strike over being ready to return an attack quickly. Both the US and Russian nuclear forces already have similar “launch on warning” postures. While a launch-on-warning policy makes it more credible that China would retaliate if another country launched a nuclear first strike, as the Union of Concerned Scientists notes, it also “raises the risk of an accidental, mistaken, or unauthorized nuclear launch, as evidenced by dozens of close calls in the United States, Russia, and former Soviet Union”.

In testimony before the Senate Armed Services Committee about the intelligence community’s annual Worldwide Threat Assessment, Director of National Intelligence James Clapper said that “unpredictable instability” is “the new normal”. The report said that “increased reliance on AI for autonomous decision making is creating new vulnerabilities to cyber attacks and influence operations”. The report also added genome editing to its list of potential weapons of mass destruction. The report said that because genome editing tools are cheap and widely available, they probably increase “the risk of the creation of potentially harmful biological agents or products”. Piers Millett, a bioweapons expert at the Woodrow Wilson Center, told the MIT Technology Review that listing genome editing as a threat was “a surprise” because producing biological weapons requires expertise in a “wide raft of technologies” and that “for the foreseeable future, such applications are only within the grasp of states”.

An advance draft of a new UN report warned that an outbreak of a virulent, communicable pathogen today could be worse than the 1918 “Spanish Flu” pandemic that killed between 50 and 100 million people. The report noted that modeling by the Bill and Melinda Gates Foundation suggested that a virulent strain of airborne influenza could kill more than 33 million people in just 250 days. The report calls for the World Health Organization (WHO) to build a new Centre for Emergency Preparedness and Response with command and control capability and access to the resources necessary to respond to a public health emergency. The report said that “the high risk of major health crises is widely underestimated and… the world’s preparedness and capacity to respond is woefully insufficient.”

New research in Proceedings of the National Academy of Sciences found with 95% confidence that sea levels rose more in the 20th century than in any other century in the last 2,800 years. Seas rose about 14 cm (5.5 in) from 1900 to 2000. NASA estimates sea levels are currently rising 3.4 mm a year (0.13 in), more than twice the average rate of the 20th century, which suggests that the rise in sea levels is accelerating. The study also found with 95% confidence that sea levels would have risen less than half as quickly if human activity were not warming the planet. The study projects that sea levels will continue to rise at an accelerating rate in the 21st century, even if we manage to reduce carbon dioxide emissions. The paper found that if emissions continue to rise throughout the century, sea levels could rise as much as 1.3 m (4.3 ft).
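A quick back-of-the-envelope calculation confirms the comparison above: 14 cm of rise over the 20th century works out to an average of 1.4 mm per year, so NASA’s current estimate of 3.4 mm per year is indeed more than double the 20th-century average.

```python
# Back-of-the-envelope check of the sea level figures cited above.

rise_20th_century_cm = 14      # total rise, 1900-2000
years = 100

# Convert cm to mm and divide by the number of years.
avg_rate_mm_per_year = rise_20th_century_cm * 10 / years
current_rate_mm_per_year = 3.4  # NASA's estimate of the current rate

print(f"20th-century average rate: {avg_rate_mm_per_year:.1f} mm/yr")
print(f"Current rate is {current_rate_mm_per_year / avg_rate_mm_per_year:.1f}x the 20th-century average")
```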

Mark Riedl and Brent Harrison presented a paper at the Association for the Advancement of Artificial Intelligence (AAAI) conference arguing that artificially intelligent machines could learn human ethics by reading stories that model social values. Riedl and Harrison taught an agent to emulate the protagonist in stories by giving the agent a reward signal when it made similar choices. Artificial intelligence (AI) researchers worry that unless an intelligent agent’s values are aligned with those of humans, it could intentionally or unintentionally act in ways that hurt us. But the complexity of real-world situations makes it practically impossible to program ethical behaviors for every imaginable situation. Riedl and Harrison note that their technique works best for agents that face a limited set of choices, but may not work as well for artificial general intelligence able to address a broad range of contingencies. “The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels and other literature,” Riedl said. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”
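The reward-signal idea can be illustrated with a toy sketch: score an agent positively when its choice in a situation matches the protagonist’s choice in a story, and negatively otherwise. This is a deliberately simplified illustration under invented names (`story_reward`, the example story and policy are all hypothetical), not Riedl and Harrison’s actual system.

```python
def story_reward(agent_choice: str, protagonist_choice: str) -> int:
    """Return +1 when the agent imitates the protagonist, -1 otherwise."""
    return 1 if agent_choice == protagonist_choice else -1

# A story reduced to (situation, protagonist's choice) pairs.
story = [
    ("finds lost wallet", "return it"),
    ("waits at pharmacy", "stand in line"),
]

# A hypothetical agent policy to evaluate against the story.
agent_policy = {
    "finds lost wallet": "return it",
    "waits at pharmacy": "cut in line",
}

# Sum the reward over the story: one match (+1) and one mismatch (-1).
total = sum(story_reward(agent_policy[situation], choice)
            for situation, choice in story)
print(total)  # prints 0
```

An agent trained against many such stories would be nudged toward socially acceptable choices, which is the intuition behind the authors’ approach.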

In a Center for a New American Security report, Paul Scharre argued that autonomous weapons—weapons systems that choose on their own what targets to engage—could be difficult to control in practice. Autonomous systems could go wrong for reasons that range from “simple malfunctions and software bugs to more complex systems failures, changing environmental conditions, hacking, and human error.” Scharre argues that instead of purely autonomous weapons, militaries should conduct what he calls “Centaur Warfighting”, in which humans and machines work closely together. “Anyone who has ever been frustrated with an automated telephone call support helpline, an alarm clock mistakenly set to ‘p.m.’ instead of ‘a.m.’ or any of the countless frustrations that come with interacting with computers,” Scharre wrote, “has experienced the problem of the ‘brittleness’ that plagues automated systems.”

This news summary was put together in collaboration with Anthropocene. Thanks to Tony Barrett, Seth Baum, Kaitlin Butler, Matthijs Maas, and Grant Wilson for help compiling the news.

For last month’s news summary, please see GCR News Summary January 2016.

You can help us compile future news posts by putting any GCR news you see in the comment thread of this blog post, or by sending it via email to Grant Wilson (grant [at] gcrinstitute.org).

This post was written by Robert de Neufville, Director of Communications of the Global Catastrophic Risk Institute.