Robo-Ethics: Exploring Ethics of Unmanned Combat Systems
by Kenneth Stewart, NPS, kastewar(at)nps.edu
Students and faculty from the Naval Postgraduate School (NPS) and the U.S. Naval Academy (USNA) recently came together with teams of junior officers from U.S. Navy Third Fleet to discuss the ethics of unmanned systems for the 2015 iteration of the Robo-Ethics Continuing Education Series. This year’s event was led via video teleconference by NPS Associate Professor Ray Buettner, April 14.
“We are interested in exploring the ethical boundaries of robotic systems … preparing tools to figure out what the future will be like,” said Buettner.
But as student and faculty researchers wade into the at-times turbulent waters of unmanned systems, they are also exploring the many ethical considerations that autonomous combat systems present. “Should a machine be able to decide to kill, and if so, what does ‘decide’ mean?” Buettner asked assembled students and others joining via video teleconference from USNA and elsewhere. “The key concept to consider may be, ‘where is the human relative to the selection of the target and the decision to engage,’” said Buettner. “Do we want discrimination authority granted to the human in the loop?”
Another area of concern being debated is the question of punishment and accountability. Researchers, ethicists and policy makers are asking questions like, ‘Who do we hold accountable when a lethal autonomous system engages the wrong target?’
While it may seem counterintuitive to debate whether a human should be “in the decision loop,” Buettner points to serious debates among ethicists over whether humans or machines are more likely to make errors that cost human lives.
Coincidentally, while Buettner and his group debated the ethics of unmanned systems, the United Nations’ Convention on Certain Conventional Weapons (CCW) was meeting in Geneva to debate a proposed ban and moratorium on Lethal Autonomous Weapons Systems (LAWS).
Buettner believes that there is currently no need for a prohibition against lethal autonomous systems, noting that existing law already provides the necessary safeguards in this area. He is referring in part to Directive 3000.09, which the DOD published in 2012 to provide guidance on the development of autonomous systems. The directive places a series of regulatory safeguards on autonomous systems development while simultaneously encouraging innovative thinking in the field.
“So far, no country has declared an intent to deploy a totally autonomous lethal system that decides who to kill and when,” Buettner noted. “Almost all fully autonomous systems are defensive.”
Buettner also noted NPS Professor Wayne Hughes’ views on the rapidly changing nature of autonomous systems. “The fundamental error in a debate over robotic development is to think that we have choice,” said Buettner, quoting Hughes. “This world is coming, rapidly coming.
“We can say whatever we want, but our opponents are going to take advantage of these attributes,” he continued. “That world is likely to be sprung upon us if we don’t prepare ourselves.”
NPS Assistant Professor Timothy Chung has long recognized the utility of research in this area. He is a pioneer in the area of unmanned aerial vehicle (UAV) swarms. “How do we take evolutionary changes in UAVs and use them to achieve revolutionary effects?” asked Chung.
In addition to exploring the ethics of unmanned combat systems, Buettner and Chung showcased ongoing CRUSER initiatives, many of which were born of student research. Current projects include the use of QR Codes in network-deprived environments and the feasibility of wireless underwater computer networks.
Editor's note: Reprinted with permission from the Naval Postgraduate School's CRUSER News. All opinions expressed are those of the respective author or authors and do not represent the official policy or positions of the Naval Postgraduate School, the United States Navy, or any other government entity.