Kargu-2 debate raises awareness of autonomous weapons

July 15, 2021

By Branka Marijan and Emily Standfield

Over the past few months, experts have been surprised by the media attention given to the Turkish-made Kargu-2 kamikaze drone or loitering munition. Everyone, it seems, wants to know if the use of the Kargu-2 in Libya in March 2020 was the first instance of an autonomous weapon being used in conflict.

Now, it is not clear that the Kargu-2 is truly autonomous. According to the definition used by several states in discussions on autonomous weapons at the United Nations Convention on Certain Conventional Weapons (CCW), which have been ongoing since 2014, the critical functions of the weapon system—target selection and engagement—must be performed by the system itself, with no human in remote control. However, at present, the manufacturer and Turkey are keeping technical details of how the Kargu-2 operates and operated in Libya close to their chests.

Perhaps the most significant point is that more people are now aware of the real possibility of autonomous weapons. Whether this new awareness and public understanding will result in increased pressure on states to regulate these systems remains to be seen. Surveys by the Campaign to Stop Killer Robots have consistently shown that more than 60 per cent of respondents oppose the use of autonomous weapons. At the very least, the media attention and public engagement on the Kargu-2 show that public opinion must be considered, even in highly technical debates. And states must, with a new sense of urgency, proceed with international regulation of this emerging category of weapons.

WHAT DO WE KNOW ABOUT THE KARGU-2?

On March 8, 2021, the UN Panel of Experts on Libya released a report documenting the use of the Kargu-2 in Libya the previous year. The report states that the kamikaze drone, built by Turkish state-owned company STM, “hunted down and remotely engaged” soldiers of Haftar Affiliated Forces. It then says, “The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.”

But matters are not as clear as they appear. It is not certain that the drone was acting autonomously or that it harmed anyone. The UN report implies that there were losses but does not confirm any. All that is known for certain is that the Kargu-2 has the capability for some autonomous functions.

Ozgur Guleryuz, CEO of STM, says that the Kargu-2 is not designed to launch fully autonomous attacks on targets. Instead, its autonomous technology “is mostly used for navigation purposes as well as designating and differentiating humans, animals, vehicles, etc.” When the Kargu-2 identifies a target, an operator must manually verify the target and launch the attack. According to Guleryuz, “Unless an operator pushes the button, it is not possible for the drone to select a target and attack.”

During informal virtual CCW exchanges between June 28 and July 2, a Turkish delegate reiterated that the UN Panel of Experts was wrong in claiming that the Kargu-2 was an autonomous weapon. He referred to the munition as remotely controlled. Because these exchanges were informal, they do not necessarily reflect the official position of the Turkish government.

However, an STM video of the Kargu-2 shows it homing in on a target and exploding among mannequins. Presumably the footage illustrates the drone's ability to accurately hit targets, including individual persons. Experts note that this kamikaze drone has been touted for its anti-personnel capabilities. Indeed, former STM General Manager Murat Ikinci has stated that each Kargu-2 is equipped with artificial intelligence and facial recognition technology.

Previously, company executives have mentioned that STM is developing swarming capabilities, which would allow the munitions to communicate with each other and make decisions independently. How well the technology works and whether it can accurately distinguish between combatants and non-combatants are not yet publicly known.

If these loitering munitions do indeed need human operators to select and engage targets, they do not fit the category of autonomous weapons, according to the dominant international definition. But what if these munitions have the ability to swarm? In that case, the central questions become how targets are selected and engaged, and what degree of autonomy is present. Many countries want to know if humans or objects are being targeted. The Kargu-2 illustrates the complexity of both the debate and the technology, highlighting the difficulty of distinguishing between human-operated and autonomous systems.

WHY PUBLIC PERCEPTIONS MATTER

All the technical questions surrounding the Kargu-2 will be explored in depth by experts, including civil society groups and individuals who closely follow CCW discussions. But media attention has now made a much wider public aware of autonomous weapons as an imminent and pervasive threat.

The technology that is key to autonomous weapons is more accessible and more easily comprehended by ordinary consumers than other weapons technologies. Consider facial recognition, which is used to unlock phones, or the technology in autonomous and semi-autonomous vehicles. During the global pandemic, drones and other surveillance technologies have come into widespread use, prompting growing scrutiny and concern about their impact on privacy.

While UN discussions on autonomous weapons might seem to be happening in a vacuum, the recent use of the Kargu-2 drone shows the opposite to be true. The world might just have witnessed the first use of a significantly autonomous weapon. Now it is watching the development of a new public understanding of how AI and other multi-use technologies are changing warfare. The world needs this public awareness.

For too long, technical knowledge of autonomous weapons has been kept in a silo accessible to only a select few. The work done by civil society organizations and academics has been key to breaking down that barrier, and the recent attention has crystallized the concerns voiced by these groups. Now, people around the world see autonomous weapons as a real, urgent problem in need of solutions. The next step is for them to convince their governments to act.
