The ‘Black Mirror’ spectre and autonomous weapons

June 6, 2018

THINKING ABOUT THE FUTURE OF WARFARE THROUGH SCIENCE FICTION

Laughter doesn’t often fill the meeting room at the United Nations. When it does, it’s a refreshing reminder that country delegates are ordinary citizens, just like the rest of us. While they have been given a great privilege and responsibility to represent their nations’ interests and serve the planet, they also watch Netflix.

This past April, at the Group of Governmental Experts meetings at the Convention on Certain Conventional Weapons in Geneva, Switzerland, a country delegate was participating in a discussion on autonomous weapons systems. While talking about the range of such systems, he slipped and said “spectre” instead of “spectrum.”

In thanking the delegate, the Chair of the Meetings picked up on this slip, remarking that the spectre hanging over the discussions in the room was the Black Mirror television series. Everyone laughed.

References to science fiction are common when discussing autonomous weapons systems. Some critics claim that such weapons exist only in fiction. Others fear precisely those dystopian scenarios depicted by creative writers. But the Black Mirror reference at that meeting in Geneva got me wondering: How useful is science fiction in thinking about the future of warfare?

BLACK MIRROR: THE SERIES

Black Mirror is a British science-fiction television series that has gained worldwide popularity. The series explores different fears about the future, particularly the role of technology. Many episodes leave the viewer anxious, with no proffered solutions and a bleak future that seems only too likely.

The episode “Metalhead” features robot dogs (inspired by the Boston Dynamics robot dog) that are fully autonomous and able to track and kill humans. Still, Black Mirror creator Charlie Brooker, speaking at the True North tech conference held in Waterloo from May 29-31, asserted with a chuckle that the series is not a “how-to” manual.

What science fiction and fiction in general do is offer a way for society to reflect on and think about the future. While not all such thinking needs to be dystopian, there is a clear value in constructing more scenarios about the future of warfare and the humanitarian implications of increasingly autonomous weapons systems. After all, militaries develop different possible scenarios as they strategize about warfare. Civil society should also create fictional constructs that help us to think about emerging security trends and how to respond to these trends and protect civilians.

THE USEFULNESS OF FICTION

Ulrike Franke, a policy fellow at the European Council on Foreign Relations, notes, “In fact, we should take science-fiction writing and other literary predictions about warfare seriously, understand their impact on public opinion, and use it as a tool to inform as well as to entertain.” Since public perceptions of technology are shaped by science fiction, it can be an important common frame of reference for policymakers and the public.

However, as Mark Gubrud points out, “The problem with science fiction as a source or vehicle for ‘thinking about the future of war’ is that it encourages people to believe that what it suggests or says is grounded in science rather than fiction.” Gubrud’s point is important. Only a clear understanding of reality, grounded in science and technology research, can usefully inform our discussions on the future of war.

Of course, not all sci-fi is equally useful. The wildly popular 1984 film The Terminator, about a cyborg killer, is often derided by proponents of autonomous weapons technology. But the film is also not useful to those concerned with preserving control of weapons systems, because it does not support a realistic discussion of current and future weapons technology. In fact, the Terminator scenario is often used to dismiss very important questions about the quality of human control over new weapons systems, such as the extent to which the human operator relies on computer-selected data to take action. In other words, civil society is questioning this loss of human control and overreliance on computer systems, not suggesting that self-aware AI will take over the world, as in The Terminator.

At a panel on future technology and entertainment at the True North conference, the responsibility of science-fiction creators and writers came up for discussion. After all, these storytellers help to shape perceptions about technology, especially for kids and youth. But it was not clear that creators acknowledge such a responsibility.

Still, thinking about how to use science fiction to explore the changing nature of warfare remains a profitable line of inquiry. The Black Mirror series is set in the near future and depicts changes society is already experiencing. Many of the sources of our anxiety and fear are vividly portrayed, making them ready objects for discussion.

New developments in military and security applications are already affecting ordinary people. Their stories, and the consequences of such technology for them, should be imagined. Focusing on people, rather than technology or weapons systems, will help most in thinking about the future of war. With luck, among the scenarios that emerge will be new ways to create more peaceful societies.

Photo – from Black Mirror episode “Metalhead” – by Jonathan Prime/Netflix

