The call by some states and civil society for the regulation of autonomous weapons continues. But concern is also being expressed that investment and research in autonomous weapons systems are outpacing regulation. In advance of the August meetings of the Group of Governmental Experts at the Convention on Certain Conventional Weapons (CCW), here’s an overview of recent developments.
1. The Luxembourg Declaration
In early July, a declaration by the Organization for Security and Co-operation in Europe (OSCE) Parliamentary Assembly (the Luxembourg Declaration) “urges participating States to support international negotiations to ban lethal autonomous weapons with a view to establishing international, legally binding rules.”
Although it’s non-binding, the declaration does provide guidance to members of the OSCE, to which Canada belongs. It is a welcome sign that autonomous weapons remain on the radar of parliamentarians, particularly those in Germany, Liechtenstein, and Switzerland, who appeared to support the ban.
2. New UN report
On June 10, the UN Secretary-General’s High-level Panel on Digital Cooperation—a panel of 22 experts co-chaired by Melinda Gates, of the Bill and Melinda Gates Foundation, and Jack Ma, founder of the Chinese tech giant Alibaba—issued its report.
The Age of Digital Interdependence supports the Secretary-General’s call for a ban on autonomous weapons by urging the maintenance of human agency: “We believe that autonomous intelligent systems should be designed in ways that enable their decisions to be explained and humans to be accountable for their use…. Life and death decisions should not be delegated to machines.”
3. New commitments by the Netherlands and Finland
In May, the Dutch parliament adopted a resolution calling for binding international regulations on new weapons technologies, including autonomous weapons. According to Pax, a Dutch NGO and one of the co-founders of the Campaign to Stop Killer Robots, the resolution received overwhelming support from all but one of the political parties in parliament. As Pax notes, this sent an important signal to the Dutch government. It will be interesting to see if the Dutch delegation to the CCW champions this cause at the August meetings.
On June 3, a new Finnish government published its coalition program, which includes the intention to pursue regulations “to ban the development and production of weapons systems based on artificial intelligence.” This decision was welcomed by the Peace Union of Finland and the Campaign to Stop Killer Robots.
4. Possible cooperation from Russia?
Russia’s position on militarized artificial intelligence and autonomous weapons could be softening—maybe.
At an international conference in Moscow in late April, Russian Security Council Secretary Nikolai Patrushev stated, “We believe that it is necessary to activate the powers of the global community, chiefly at the UN venue, as quickly as possible to develop a comprehensive regulatory framework that would prevent the use of the specified [new] technologies for undermining national and international security.” He went on to say that “Russia is ready to take part in this process.”
Patrushev’s statement must have surprised anyone who follows Russian defence news or has attended CCW meetings. To date, Russia has been a loud voice in the CCW meeting room against the regulation of autonomous weapons. Along with some other countries, Russia has suggested that existing international law is sufficient to regulate autonomous systems.
And, at the informal June consultation of the CCW Group of Governmental Experts on lethal autonomous weapons systems in Geneva, it did not appear that Patrushev’s views were represented or that Russia was willing to take part in the UN process.
So, has anything changed? Something else to watch for at the August meetings.
5. New systems grow military demand
Weapons manufacturers are showing off increasingly autonomous weapons systems that major militaries seem keen to acquire. And concern is growing that humankind is already sliding down the slippery slope to autonomous systems.
More sophisticated unmanned aerial vehicles (or drones) are being developed. While many can function in autonomous mode, a human remains in charge of key functions—for now. But the future could be quite different.
Responding to news that the U.S. Army’s Advanced Targeting and Lethality Automated System (ATLAS) was being used to develop an automated turret, Stuart Russell, University of California professor and opponent of autonomous weapons, said, “If they are explicitly contemplating a human authorizing an attack on a whole group of targets, then the algorithm is making the target recognition and firing decisions. After that it’s a short journey to fully autonomous tanks, ground-attack air vehicles, etc.”
Russia is also in the game. At a major arms convention in Moscow in June, Russian arms makers showed some weapons systems that can function autonomously. Analysts suggest that Russia has been testing some of these robotic systems in Syria.
So, will the CCW meeting in August lead to any significant steps in the regulation of autonomous weapons? It does not appear likely that two days can change the current trajectory of the stalled discussions at the CCW. Still, political and civil-society developments over the past several months do indicate that there will be continued pressure on governments to establish a robust regulatory framework, including a ban on fully autonomous weapons systems.
One thing is certain. Civil-society groups, including academics and researchers, will be at the CCW, reminding countries that time is running out to regulate the mounting tide of autonomous weapons.
Photo: In a scene from “Slaughterbots,” a video produced by the Future of Life Institute to illustrate the dangers of autonomous weapons, large drones can blow holes in walls to allow smaller drones to get in and hunt down their targets. (Still image from YouTube video)