Monday, March 16, 2015

Exploring Unmanned System Autonomy in the DoD

Editor's note: Reprinted with permission from the Naval Postgraduate School's CRUSER News, by LCDR Nathaniel Spurr, NPS Systems Engineering Student, ncspurr(at)

The objective of the Center for Technology and National Security Policy (CTNSP) symposium, held on February 24, 2015 at the National Defense University, was to foster an open, unclassified discussion of the potential that unmanned system autonomy holds for the Department of Defense (DoD) in the 2025 timeframe. The topic is of critical importance to SEA-21A’s integrated project, which seeks to recommend a maritime system of systems (SoS) to support over-the-horizon targeting (OTHT) in a contested littoral environment during the same period. The symposium began with a keynote address by General James E. Cartwright, USMC (Ret), former Vice Chairman of the Joint Chiefs of Staff, who emphasized the partnered role of autonomy and human interaction. It was followed by presentations from discussion panels of government, military, and industry subject matter experts, who addressed autonomous system limitations, the associated operating environments, and perspectives on current and future military progress.

From the symposium’s outset, it was clear that one of the most important steps when discussing autonomy (in any capacity) is to first properly define the term. In this era of unmanned systems and remotely piloted aircraft (RPA), the word “autonomous” is often misused, and it is important to note that few (if any) fully autonomous systems are currently in use by the DoD. As Paul Scharre of the Center for a New American Security (CNAS) defines it in his article “Between a Roomba and a Terminator: What is Autonomy?”, “…autonomy is the ability of a machine to perform a task without human input.” He continues by defining an autonomous system as “…a machine, whether hardware or software, that, once activated, performs some task or function on its own.” This definition implies a certain level of “thinking” or inference by the machine, suggesting that it is capable of learning. Describing unmanned systems like Northrop Grumman’s MQ-8 Fire Scout as “autonomous” is therefore not quite accurate. While some of its tasks, such as takeoff and landing, have been automated, as on many existing commercial and military aircraft, those tasks are not truly autonomous: they still require a human to interact with the system by entering specific waypoints for navigation or defined parameters for takeoff and landing (e.g., desired glide slope, airspeed, and altitude). Moreover, in performing these tasks, such systems still require a human “on” the loop (i.e., human supervision), so the tasks, though automated, are not truly autonomous.
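The distinction between a human “in” the loop (operator input required before each action) and a human “on” the loop (the system acts on its own while a supervisor may intervene) can be illustrated with a minimal sketch. This is purely hypothetical illustrative code, not a model of the Fire Scout or any DoD system; the mode names, `approve`, and `veto` callbacks are all invented for this example:

```python
from enum import Enum

class Mode(Enum):
    IN_THE_LOOP = "in"   # human must approve each action before it runs
    ON_THE_LOOP = "on"   # system acts on its own; human may veto

def run_task(mode, actions, approve=None, veto=None):
    """Execute a planned sequence of automated actions under human oversight.

    approve(action) -> bool: consulted before each action (in-the-loop);
                             with no approver, nothing executes.
    veto(action)    -> bool: may cancel an action the system chose
                             (on-the-loop supervision).
    """
    executed = []
    for action in actions:
        if mode is Mode.IN_THE_LOOP:
            if approve is None or not approve(action):
                continue  # no human input means no action
        elif mode is Mode.ON_THE_LOOP:
            if veto is not None and veto(action):
                continue  # supervisor intervened
        executed.append(action)
    return executed

plan = ["takeoff", "waypoint_1", "land"]

# In-the-loop: only actions the operator explicitly approves are executed.
print(run_task(Mode.IN_THE_LOOP, plan, approve=lambda a: a != "waypoint_1"))
# → ['takeoff', 'land']

# On-the-loop: the system proceeds unless the supervisor steps in.
print(run_task(Mode.ON_THE_LOOP, plan, veto=lambda a: a == "land"))
# → ['takeoff', 'waypoint_1']
```

In this toy framing, an automated system sits in one of the two supervised modes; a fully autonomous system would be the (absent) case where no approver or supervisor exists at all.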

With the definitions of, and differences between, “autonomous” and “automated” established, much of the symposium’s discussion focused on the current state of unmanned systems and what progress might be seen in the DoD by 2025. Notably, both the panel experts and the audience universally agreed that implementation of autonomous lethality (or “weaponized autonomy”) in the DoD is unlikely for the foreseeable future, given the significant cultural, ethical, and policy concerns surrounding its use. There was similar agreement among the symposium’s attendees that unmanned platforms will continue to augment manned platforms and are unlikely ever to replace them completely in DoD use. This reinforces Scharre’s position that the term “full autonomy” (human “out” of the loop) is meaningless and that we should instead focus on “…operationally-relevant autonomy: sufficient autonomy to get the job done.” These levels of operationally-relevant autonomy will therefore continue to keep a human either “in” or “on” the loop, and keep current and future DoD unmanned systems focused on the relationship between the human and the machine as autonomy continues to carry the information age into the robotics age.

All opinions expressed are those of the respective author or authors and do not represent the official policy or positions of the Naval Postgraduate School, the United States Navy, or any other government entity.
