Are we entering an age of killer robots?


ASU's Dr. Heather Roff testifies at the UN

Dr. Heather Roff, researcher at ASU's Global Security Initiative, testifies at a UN meeting of experts to consider lethal autonomous weapons. Dr. Roff's testimony focuses on unintentional risks of autonomous weapons systems, including artificial intelligence and human control. The meeting is being held under the auspices of the Convention on Certain Conventional Weapons from April 11-15 in Geneva. Photo courtesy of Heather Roff

The world’s militaries are close enough to wielding weapons that decide on their own whether to kill — called Lethal Autonomous Weapons Systems — that the United Nations this week gathered a select few experts, including an ASU researcher, to separate fact from fear of robot overlords.

Dr. Heather Roff, research scientist with ASU’s Global Security Initiative and current Oxford senior research fellow, is testifying on the unintended risks of such weapons and on where autonomous or semi-autonomous weapons already exist. Her research focuses on artificial intelligence and human control.

Question: Why would we want weapons systems that control themselves? Do we have some already?

Answer: There are real potential advantages to the use of autonomous weapon systems (AWS), including force protection and low cost — in both monetary and human terms. There is also the increased ability to maneuver in “denied” environments, like underwater or in areas with jammed communications. In this way, AWS could be a “force multiplier” — that is, maximizing the effects of force without requiring a person to deliver it. However, the concerns outweigh these potential benefits. Use of AWS could lead to an artificial-intelligence “arms race,” in which states attempt to develop ever-stronger AI to outmatch their opponents. This alone is troublesome, but there are other grave risks too, such as the risk of malfunction or fratricide, the escalation of a conflict, increased levels of deniability, stressors on accountability and the potential for mass surveillance of populations. The potential for mass human-rights abuses overshadows short-lived advantages in a denied warfighting space.

Q: Are we at risk of a real-life Skynet?

A: The key issue here is whether states want to delegate the decision to kill to a machine. If they do not want to relinquish that authority to a tool, and reserve that only for human beings, then they cannot support AWS. I understand that many people may think of AWS as some sort of precursor to “Skynet,” but I don’t think that is a helpful analogy. If we jump from AWS — something that can engage and fire without human intervention — to Skynet, then we fail to see the issues that are in front of us now: systems that may not be easily testable under combat conditions, are not verifiable, may not be predictable or reliable, systems that would breed incentives for human-rights abuses, or could trigger conflicts due to unintended engagements.  

Q: When we think about autonomous weapon systems, people may think of “drones.” Are they right?  

A: AWS are not “drones.” A remotely piloted aircraft — or RPA — is piloted. In other words, there is a human there from the outset. Not only is a human piloting the aircraft, but many other humans are engaged with its operation and with the decision to use lethal force. As to whether “drones” could become autonomous, the answer is “sure.” Autonomy is a capability. With the right software, we can make unmanned systems autonomous. Consider, for example, the U.S. Navy’s swarm boats or Google’s self-driving car. The difference, however, is that those vehicles are not armed and are not making lethal decisions. I think autonomy will challenge diplomacy because of difficulties with transparency, signaling intent between states, showing resolve and permitting confidence-building measures. The ubiquitous use of these systems could change the face of conflict in ways we don’t yet fully understand.

To schedule an interview with Dr. Roff, please contact Logan Clark at the ASU Office of Media Relations and Strategic Communications at mediarelations@asu.edu.
