Thomas E. Simmons

The more decisive a weapon is
The more surely it will be used,
And no agreements will help.

JOHN ADAMS, DOCTOR ATOMIC Act I (2005)
(libretto by Peter Sellars)

Introduction

In Forrest Gump, Gary Sinise played the character of Lieutenant Dan Taylor, a physically and psychologically wounded Vietnam War veteran.[1] In the more than twenty years since the movie was released, Sinise has leveraged the emotional resonance of his character to support various veterans’ organizations.[2] In recognition of his efforts, he has been named an honorary Navy Petty Officer and an honorary Marine. His Lt. Dan Band (in which Sinise plays bass) is named after the character that he made famous.[3]

There are many voices in the contemporary debate over lethal autonomous weapons systems and whether their use should be outlawed in warfare, but Lieutenant Dan’s voice—the voice of veterans who bear the costs and scars of battle—has not been heard. It deserves to be. Autonomous weapons systems (“AWS”)[4] could improve the effectiveness of remote-controlled drones and reduce the severity of the widespread posttraumatic stress disorder suffered by drone pilots.[5] This essay sketches the etiology and epidemiology of posttraumatic stress disorder (PTSD), outlines the current state of the debate on AWS, and concludes with a rejection of the calls for an international treaty banning autonomous weapons. Because popular culture, especially film, shapes popular conceptions and misconceptions of AWS to a large degree, references to popular culture inform this discussion.[6]

I. An Overview of Posttraumatic Stress Disorder

One Vietnam War veteran, haunted by his combat experiences, declared, “I was not an animal or some kind of killing machine . . . I’m still trying to qualify as a human being.”[7] PTSD was recognized towards the end of the Vietnam War; prior to that time its manifestations were called “shell shock.”[8] PTSD is an anxiety disorder that may develop after exposure to a traumatic event such as combat, even among individuals without any predisposing conditions.[9] Witnessed events as well as experienced events can cause PTSD, and the symptoms “may be especially severe or long-lasting when the stressor is interpersonal and intentional (e.g., torture, sexual violence).”[10]

The symptoms are varied. Some veterans suffering from PTSD will exhibit “fear-based re-experiencing” and behavioral manifestations.[11] Others will “flat line” emotionally and lose the ability to experience pleasure (anhedonic mood states) or suffer profound unease (dysphoric mood states).[12] Some will dissociate; in others, arousal symptoms predominate; still others will exhibit a complex combination of multiple symptoms.[13] Symptoms may manifest a few months or several years after the traumatic event or events.[14]

The popular-culture depiction of PTSD centers on uncontrollable and distressing flashbacks in which the trauma is relived and re-experienced.[15] A flashback may last a few seconds or as long as several days.[16] In the film The Deer Hunter, the main character and his friends are captured by sadistic North Vietnamese fighters and forced to play Russian roulette.[17] He survives and they escape, but one friend is so psychologically damaged by the episode that he ultimately commits suicide.[18]

The intensity of trauma required to trigger PTSD varies from individual to individual. Both intentional acts (e.g., criminal assaults) and unintentional events (e.g., natural disasters) can cause PTSD, yet intentional acts appear to have a greater capacity to do so.[19] Combat experience is frequently the cause of the disorder. Bomber pilots suffer PTSD even though they are more removed from the battlefield than ground soldiers.[20] Contemporary data shows that drone operators, who remain safely stateside while directing a drone on the opposite side of the globe, also suffer from PTSD—although the disorder may occur less frequently among them than among soldiers engaged in hand-to-hand combat.[21] One drone pilot described the emotional trauma he experienced in having to select targets and release munitions, with the very real possibility of killing innocent civilians, as “chill[ing] the marrow of my being.”[22] In a sense, although drone pilots are removed geographically from the actual battlefield, they engage in a more intimate kind of war, viewing their targets at high magnification and tracking them for extended periods.[23]

II. International Rules of Engagement

The existing framework of international humanitarian law neither expressly permits nor prohibits AWS, but at least four provisions have possible application. Wartime attacks must satisfy several criteria: distinction between soldiers and civilians, military necessity, proportionality, and humanity.[24] The Martens Clause, restated in Additional Protocol I to the Geneva Conventions, also bans any weapons that violate the “principles of humanity and the dictates of the public conscience.”[25] These provisions serve to contain civilian deaths and the destruction of private property, yet they also protect combatants. The principles are honored by the military in part “because they protect our . . . troops’ humanity.”[26]

Recently, the United Nations held a series of meetings in Geneva to consider an international treaty to limit or ban lethal autonomous weapons systems (“LAWS”), as it did with blinding laser weapons in 1995.[27] The United States has suggested that a LAWS treaty is unnecessary because existing international law sufficiently moderates the use of AWS in wartime.[28] The states parties to the meetings appear generally to agree that some level of human control over LAWS should be required, but disagree as to the minimum level of human control that should be legally mandated.[29] A fully automated war system in which retaliatory missiles continue to be launched against an enemy even after all the combatants have died certainly ought to be avoided, but such a scheme would already violate the existing rule of military necessity; it does not necessarily require a ban on autonomous weapons.[30] Indeed, a fully automated war system—such as that portrayed in Battlestar Galactica, where robots (“cylons”) go to war with humans[31]—would presume a total failure of the chain of command. “All pilots are subject to orders from their commanders,” observes Major DeSon.[32] “Robot pilots should be no different.”[33] Semiautonomy would reserve strategic decision-making, including the ultimate political decision of whether to wage war, exclusively to human actors.

There is already some loss of meaningful human oversight in many existing and even ancient weapons systems, a point beyond which the projectile and its lethality can no longer be recalled: an arrow launched from its bow, a live grenade tossed into a tank’s hatch, a nuclear weapon released from a bomber. The weapon, in its arc toward destruction, is effectively autonomous, although the timing of its release was the product of a conscious human decision.

Weapons systems that cause destruction long after the human activation decision tend to raise humanitarian concerns.[34] Weapons with an especially long fuse often violate the principle of discrimination between civilians and combatants. Land mines and booby traps do not discriminate—as human soldiers can—between children and adults or between uniformed soldiers and farmers. Here the problem is more one of imprecision than delay. If, theoretically, a land mine could be designed to deploy only during a period of active aggression and to contain a fail-safe mechanism preventing detonation when stepped on by a civilian, the principle of discrimination would appear to be satisfied, although land mines could still be—and, in fact, are—the subject of an outright ban due to deeper humanitarian concerns.[35]

III. Autonomous Killer Robots as a Means of Minimizing Combatants’ PTSD

The most chilling description of the possible trajectory endpoint for AWS technology was made by Professor Stuart Russell of Berkeley in the journal Nature:

[A]s flying robots become smaller, their manoeuvrability [sic] increases and their ability to be targeted decreases. They have a shorter range, yet they must be large enough to carry a lethal payload—perhaps a one-gram shaped charge to puncture the human cranium. Despite the limits imposed by physics, one can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenceless [sic].[36]

He concludes flatly: “This is not a desirable future.”[37]

The scene envisioned by Professor Russell is not a desirable future. But the question this horrific image poses for the AWS debate is whether it is any less horrible for one million individual drone pilots to fly these bumblebee-sized bombers from a safe distance than it is for the drones to fly autonomously.[38] Requiring human drone pilots to do a job that robotic pilots could accomplish just as well results in the same casualties and damage to the targets. The primary difference between remotely piloted drones and AWS is the absence of close human involvement and the resulting diminished psychic cost to those humans who would otherwise guide and maneuver the drones towards their targets. Replacing drone pilots with AWS may therefore significantly reduce the frequency and intensity of PTSD among those who would otherwise fly the drones.[39] Moreover, with advances in technology, the collateral risks to civilians and noncombatants might actually be reduced if targeting is computerized.

One objection to AWS is premised on the principle of discrimination and assumes that lethal robots will kill with less discrimination than armed soldiers.[40] One of the opening scenes in the original RoboCop film illustrates an AWS that violates the discrimination principle. An armed security detail robot being showcased in a boardroom malfunctions, turns its turrets on a member of the corporate board of the company that has developed the technology, and kills him.[41] Human Rights Watch argues that a human—but not a robot—would avoid such errors of discrimination; that a human, for example, could discern the nonhostile intent of a child playing with a toy gun where a robot might not.[42]

Other applications of autonomous technology, however, suggest that AWS may do a better job than humans in discriminating between combatants and noncombatants, and in executing nonemotional adjustments to proportionality. The fleet of self-driving Google cars has experienced fifteen collisions since going on the road in 2009, one involving injury to the car’s occupants, but with one exception all of the accidents were caused by other (human) drivers or by a human’s manual override of the autonomous vehicle.[43] Robots already appear able to drive cars more safely than we humans do.

Driving a car involves a very different set of managed inputs and outputs than navigating a combat zone does. The software and hardware required to automate combat tasks will have to be correspondingly suited to the challenges and objectives of warfare. It may, in fact, prove much more difficult to design a reliable autonomous weapon than scientists now project. Ethics (even narrowly defined battlefield ethics) cannot be reduced to an algorithm, but increasingly sophisticated software may nonetheless outpace human capabilities.[44] The technology of AWS will never be error-free, but it may soon prove less error-prone than any human battalion.

The debate over AWS assumes that the technology will ultimately prove reliable, agile, and disturbingly lethal. Those scholars arguing against AWS suggest—like those who briefly outlawed the dropping of bombs from balloons at the end of the nineteenth century—that the primary problem with AWS will be their effectiveness.[45] But so long as technological advances fall short of endowing autonomous weapons with the ability to compete with humans in achieving proportionality and discrimination, human intervenors will remain preferable in terms of effectiveness; it is only when technology allows a steam drill to outpace a John Henry[46] that the steam drill will be widely adopted.[47] Robots, in fact, may perfect proportionality to the point where the rules of engagement are ratcheted up to require near-certainty of avoiding civilian deaths.[48]

The final argument against AWS essentially inverts the emphasis on the benefits to be gained from diminishing soldiers’ direct involvement in the horrors of combat: some have argued that AWS will make war too easy.[49] Major DeSon, in his recent AWS article in The Air Force Law Review, recounts a 1967 episode of Star Trek in which Captain Kirk encounters neighboring planets that have fought a centuries-long computerized war.[50] Kirk engineers a breach of a longstanding treaty that capped the loss of human life in any exchange. While the treaty successfully contained warfare, it also seems to have prolonged it. Captain Kirk explains that he has revived the horrors of total war so that the planets’ inhabitants have a motivation to avoid it. In the Star Trek episode, Kirk’s plan works; peace is achieved. But making war more psychologically costly for our soldiers in the hope of sowing the seeds of peace is a poor rationale. Despite the numerous insights of Captain James T. Kirk, his act of sabotage in this episode could just as well have resulted in the expansion, not the contraction, of war. It is not an act of humanitarianism to sidestep advances in technology that may spare soldiers some of the scars of battle.

Conclusion

In Forrest Gump, Lieutenant Dan loses both legs in the Vietnam War and spends most of the film coming to grips with his loss.[51] Combat took from Lieutenant Dan part of his physical body as well as part of his psyche; the price he paid as a soldier was enormous in both physical and emotional terms. Death and injuries, both somatic and psychological, are part of the bargain for soldiers, but this does not mean that we should ignore an opportunity to permit the evolution of warfare in ways that will reduce the psychological costs of combat. AWS technology appears to offer this opportunity. The carnage caused by automated and autonomous weapons deployed by members of the military will undoubtedly still result in PTSD, for soldiers will still be tasked with the decision of whether to deploy AWS, a decision with deadly consequences. But PTSD manifestations ought to occur with less frequency and severity than if the same carnage were wrought more directly. We should not lose sight of the unique challenges and humanitarian concerns inherent in AWS, but neither should we disregard technology that might diminish psychic scars. If we have the ability to avoid turning future soldiers into “killing machines” and instead design machines for that task, the benefits to the soldiers themselves ought to be considered.

—–

*   Thomas E. Simmons is an assistant professor at the University of South Dakota School of Law.

[1].   Forrest Gump (Paramount Pictures 1994).

[2].   Charity History, Gary Sinise Foundation, https://www.garysinisefoundation.org/partners/charity-history (last visited Mar. 3, 2016).

[3].   Disabled Troops Inspire Gary Sinise to Give Back, CBS News (July 10, 2012), http://www.cbsnews.com/news/disabled-troops-inspire-gary-sinise-to-give-back/.

[4].   An AWS is “[a] weapon system that, once activated, can select and engage targets without further intervention by a human operator.” U.S. Dep’t of Def., Directive 3000.09, Autonomy in Weapon Systems 13 (Nov. 21, 2012), http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf.

[5].   James Dao, Drone Pilots Are Found to Get Stress Disorders Much as Those in Combat Do, N.Y. Times (Feb. 22, 2013), http://www.nytimes.com/2013/02/23/us/drone-pilots-found-to-get-stress-disorders-much-as-those-in-combat-do.html?_r=0.

[6].   See Gabi Siboni & Yoni Eshpar, Dilemmas in the Use of Autonomous Weapons, 16 Strategic Assessment 75, 75 (2014) (emphasizing cultural anxiety towards AWS on account of films like The Terminator); see also Jack M. Beard, Autonomous Weapons and Human Responsibilities, 45 Geo. J. Int’l L. 617, 623 (2014) (noting that while movies “often portray a future in which genuinely smart machines . . . fight wars” that “[i]t is unlikely . . . that military machines will soon (if ever) be able to ‘think’ like humans”).

[7].   Robert Jay Lifton, Home from the War 128–29 (1973).

[8].   Jeffrey Kirkwood, Introduction to Daryl S. Paulson & Stanley Krippner, Haunted by Combat, at xvi (2007).

[9].   American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders 271, 274 (5th ed. 2013) [hereinafter DSM-V].

[10].   Id. at 275.

[11].   Id. at 274.

[12].   Id.

[13].   Id.

[14].   Id. at 276.

[15].   See, e.g., Jacknife (Kings Road Entertainment 1989).

[16].   DSM-V, supra note 9, at 275.

[17].   The Deer Hunter (EMI Films 1978).

[18].   Id.

[19].   DSM-V, supra note 9, at 275.

[20].   See Dao, supra note 5.

[21].   See Phil Stewart, Overstretched Drone Pilots Face Stress Risk, Reuters (Dec. 18, 2011, 6:12 PM), http://www.reuters.com/article/2011/12/18/us-usa-drones-stress-idUSTRE7BH0VH20111218 (describing an Air Force survey of drone operators which found that 17% showed “signs of ‘clinical distress’”); see also Pratap Chatterjee, A Chilling New Post-traumatic Stress Disorder: Why Drone Pilots Are Quitting in Record Numbers, Salon (Mar. 6, 2015, 2:30 PM), http://www.salon.com/2015/03/06/a_chilling_new_post_traumatic_stress_disorder_why_drone_pilots_are_quitting_in_record_numbers_partner/ (suggesting that “a sense of dishonor in fighting from behind a screen thousands of miles from harm’s way” may be inducing a new kind of long-distance PTSD).

[22].   Lieutenant Colonel Matt J. Martin with Charles W. Sasser, Predator: The Remote-Control Air War over Iraq and Afghanistan: A Pilot’s Story 52 (2010).

[23].   See Chatterjee, supra note 21 (quoting Lieutenant Colonel Bruce Black: “A [drone] pilot has been watching his target[s], knows them intimately, knows where they are, and knows what’s around them”).

[24].   Gregory P. Noone & Diana C. Noone, The Debate Over Autonomous Weapons Systems, 47 Case W. Res. J. Int’l L. 25, 29 (2015).

[25].   Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts art. 1, opened for signature Dec. 12, 1977, 1125 U.N.T.S. 3, 7; see also Tyler D. Evans, Note, At War with the Robots: Autonomous Weapon Systems and the Martens Clause, 41 Hofstra L. Rev. 697, 700 (2013).

[26].   Charles J. Dunlap, Jr., A Tale of Two Judges: A Judge Advocate’s Reflections on Judge Gonzales’s Apologia, 42 Tex. Tech L. Rev. 893, 904 (2010) (quoting Richard C. Schragger, Cooler Heads: The Difference Between the President’s Lawyers and the Military’s, Slate (Sept. 20, 2006, 5:10 PM), http://www.slate.com/articles/news_and_politics/jurisprudence/2006/09/cooler_heads.html).

[27].   Stuart Russell, Take a Stand on AI Weapons, 521 Nature 415, 416 (2015).

[28].   Id.

[29].   Id.

[30].   See, e.g., Screamers (Triumph Films 1995) (based on the Philip K. Dick short story “Second Variety”) (portraying autonomous lethal weapons which, once deployed, operate indefinitely and self-replicate). Members of the army deploying the weapons wear broadcasting tags so as to avoid being targeted by their own weapons system. Id.

[31].   Battlestar Galactica (Glen A. Larson Productions 1978). Several television series by the same name were inspired by the original series.

[32].   Major Jason S. DeSon, Automating the Right Stuff? The Hidden Ramifications of Ensuring Autonomous Aerial Weapon Systems Comply with International Humanitarian Law, 72 Air Force L. Rev. 85, 94 (2015). AWS could also be constrained with “geographic, mission-specific limitations.” Benjamin Kastan, Autonomous Weapons Systems: A Coming Legal “Singularity”?, 2013 U. Ill. J.L. Tech. & Pol’y 45, 61.

[33].   DeSon, supra note 32.

[34].   See Statement to the United Nations on Weapons, International Committee of the Red Cross (Oct. 16, 2013), https://www.icrc.org/eng/resources/documents/statement/2013/united-nations-weapons-statement-2013-10-16.htm.

[35].   See Jack H. McCall, Jr., Infernal Machines and Hidden Death: International Law and Limits on the Indiscriminate Use of Land Mine Warfare, 24 Ga. J. Int’l & Comp. L. 229, 239 (1994) (noting that the U.N. protocol banning land mines was primarily directed at protecting civilian populations, although it also protects soldiers and peacekeepers by prohibiting “all mines or booby traps in the vicinity of U.N. peacekeeping troops”).

[36].   Russell, supra note 27.

[37].   Id.

[38].   This trade-off is purely hypothetical. At least under current technology, the idea of one million drones piloted by one million pilots is infeasible. See Chatterjee, supra note 21 (noting that a combat air patrol consists of three to four drones “and each takes as many as 180 staff members to fly them”) (internal citation omitted).

[39].   See Brett T. Litz & William E. Schlenger, PTSD in Service Members and New Veterans of the Iraq and Afghanistan Wars: A Bibliography and Critique, 20 PTSD Res. Q. 1, 2 (2009) (noting that “time in forward areas [and] witnessing others wounded or killed” increased the risk of PTSD among service members).

[40].   Christopher P. Toscano, “Friend of Humans”: An Argument for Developing Autonomous Weapons Systems, 8 J. Nat’l Security L. & Pol’y 189, 210–11 (2015).

[41].   RoboCop (Orion Pictures 1987).

[42].   Bonnie Docherty, Human Rights Watch, Losing Humanity: The Case Against Killer Robots 31–32 (2012), http://hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf [hereinafter Humanity]. “Technological fixes could not give fully [AWS] the ability to relate to and understand humans that is needed to pick up on such cues [as facial expressions and social context].” Id. at 32.

[43].   See Mark Prigg, Google Self-Driving Car is Involved in Its First Injury Accident: Tech Giant’s Autonomous Car is Rear-Ended Causing ‘Minor Whiplash’ to Three Employees, Daily Mail (July 16, 2015), http://www.dailymail.co.uk/sciencetech/article-3164675/A-self-driving-SMASH-Watch-Google-s-autonomous-car-rear-ended-firm-admits-drivers-hitting-surprisingly-often.html; Chris Ziegler, A Google Self-Driving Car Caused a Crash for the First Time, The Verge (Feb. 29, 2016, 1:50 PM), http://www.theverge.com/2016/2/29/11134344/google-self-driving-car-crash-report.

[44].   See Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots 127–33 (2009) (outlining “ethical governor” software for artificial intelligence decision-makers).

[45].   See Charles A. Shanor & L. Lynn Hogue, Military Law in a Nutshell 260 (2d ed. 1996) (describing the Hague Balloon Declaration of 1899 which banned “the dropping of bombs from balloons for five years”).

[46].   See Katy Steinmetz, Top 10 Man-vs.-Machine Moments, Time (Feb. 15, 2011), http://content.time.com/time/specials/packages/article/0,28804,2049187_2049195_2049267,00.html.

[47].   See Humanity, supra note 42, at 32–35 (arguing that AWS will not be able to achieve proportionate-force-deployment or military-necessity assessments). Human Rights Watch also argues—weakly—that AWS would violate the Martens Clause’s ban on weapons that violate “‘principles of humanity’ and the ‘dictates of public conscience’” because a majority of people will find killer robots “shocking and unacceptable.” Id. at 35.

[48].   DeSon, supra note 32, at 113. “[T]he pendulum of public sentiment both at home and abroad may be shifting to less toleration of any civilian casualties in war as advances in technology make weapons increasingly precise.” Id.

[49].   Marco Sassóli, Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to be Clarified, 90 Int’l L. Stud. 308, 315 (2014).

[50].   DeSon, supra note 32, at 107 (citing Star Trek: A Taste of Armageddon (NBC television broadcast Feb. 23, 1967)).

[51].   Forrest Gump, supra note 1; see also Winston Groom, Forrest Gump 155 (Washington Square Press 2002) (1986) (narrating the main character’s observations of Lieutenant Dan: “an his eyes is gleamin from behin his beard . . . he is the one needs some hep [sic]”).