Apr 04, 14
[On April 16th and 17th I'll help lead a discussion with West Point cadets in their Just War Theory class. The text below is a preparatory reading assignment, in addition to a short article by Zenko & Kreps from a recent issue of Foreign Affairs.]
Drone Warfare Discussion
West Point, 4/16 and 4/17
Mike Gutierrez, Loyola University Chicago
“Wo aber Gefahr ist, wächst
Das Rettende auch”
“But where danger is, grows/ The saving power also.” In this excerpt from the hymn “Patmos” by the German poet Friedrich Hölderlin we encounter the pairing of danger [die Gefahr] and saving power [das Rettende]. The poem gives priority to danger, first, and the saving power that might save us from the danger, second. For our own purposes we will reverse this priority in order to think from the perspective of those who are charged with the responsibility of safeguarding a people, particularly in the sense of a military charge. From this perspective we are apt to consider what is best for the goal of safeguarding, as the first and highest priority, and then consider what danger lies therein, second. Such a scheme of priority seems to be a necessity for a soldier, but it does not render reflection on the dangers inherent in certain strategies of safeguarding unimportant. Only in such reflection will we understand the meaning and extent of the ‘safe’ in ‘safeguarding.’
I offer here a brief sketch of one such danger that manifests itself in the widespread use of drones as a technology of war, and, more generally, in the increased automation of military technology as such. My thesis (which is really more the raising of a question) is that the use of automated technologies in warfare covertly reframes our experience of war such that the ethical, social, and political horizon of our possibilities becomes narrowed in unacceptable ways. At the end of the paper I will suggest different paths for exploring how this reframing operates.
(1) Is there a danger?
We might ask ourselves, however, whether any such danger really presents itself in the use of drones. Or a danger that is, at any rate, novel and unfamiliar to thinkers who have reflected on the challenges of war, ethical or otherwise, over the centuries. Some individuals are quite reluctant to acknowledge that drone warfare is a remarkable development. In a recent Foreign Affairs article, two authors note the following:
“In a speech last November, Thomas Lawson, Canada’s chief of the defense staff, equated missiles fired from a drone with those fired from a piloted aircraft, because they both reach their target as intended. This view -- that drones represent not a paradigm shift but just a different way for states to do what they have done for decades -- has become widespread. As Norton Schwartz, then chief of staff of the U.S Air Force, said in May 2012, “If it is [a legitimate target], then I would argue that the manner in which you engage that target, whether it be close combat or remotely, is not a terribly relevant question.” But that view ignores how drones create a particular moral hazard by keeping pilots away from danger. Because the costs of launching deadly strikes with drones are lower than with piloted aircraft, civilian officials are more willing to authorize them.”
The authors (Kreps and Zenko, 2014) keep open the possibility that drone warfare does, indeed, present a paradigm shift. They do so by attacking the unstated presupposition that all decisions in war are made with reference only to accomplishing the objective, regardless of economic or casualty considerations. I’d like to keep open the same possibility, and further argue that the paradigm shift conceals a danger. But I’ll pursue a more philosophical line of approach by attacking another unstated presupposition of Lawson and Schwartz: namely, the presupposition that the technology of drone warfare is, in its essence, instrumental, merely a means to an end.
(2) Is the technology of drone warfare simply a tool or technique?
We can proceed along this course by way of three questions:
(i) What does it mean for technology to be instrumental?
(ii) And, what do we mean when we say that the technology of drone warfare is ‘merely’ instrumental?
(iii) And, what do we mean by denying that modern technology is essentially instrumental?
First, what does it mean for technology to be instrumental? In the simplest terms, to be instrumental is to be a means to an end. We can think of an instrument as a tool: a hammer is for hammering nails; a ruler is for measuring; and so on. We can also think of an instrument as a technique. In fact, this is the more originary meaning, stemming from the Greek noun technē, which means the art or craft appropriate to an artisan or craftsman. The technology of drone warfare can be interpreted as instrumental in both senses: as a tool for getting a job done, and more generally as a technique for opening up a range of functionality not accessible by other means.
Second, what does it mean to say that technology in general (and drones in particular) is ‘merely’ instrumental? To make such an assertion is to argue that the essence of technology is restricted to its instrumentality. Lawson and Schwartz make such an assertion, primarily in the first ‘tool’ sense: to hear them describe the process of target selection, one would think that it proceeds independently of the process of technique selection. That is, the target is selected; then, and only then, is the appropriate means for striking the target selected. This interpretation holds some initial plausibility insofar as the division of labor within the military, combined with the hierarchical distribution of decision-making along the chain of command, means that military actors may know what or who the target is before they know by what means they are supposed to strike it.
However, when we consider the decision procedure for target selection as a whole, the ‘tool’ sense of instrumentality would never suffice. Why? Because it would make no strategic sense for targets to be selected irrespective of the practical costs and benefits associated with striking the target. We have to appeal to the second ‘technique’ sense of instrumentality to understand how higher-level decisions concerning the use of the technology of drones (and the technology of warfare in general) are made. Different techniques of warfare open up different ranges of practical possibilities, each with its own set of moral, social, economic, and other costs and benefits. The authors Kreps and Zenko gesture toward these different horizons of possibility by highlighting the moral hazards harbored by a technology of warfare that is cheap and apparently successful at keeping soldiers out of harm’s way. New techniques may offer new benefits, but their full value can only be assessed when the new costs are taken into account.
These two senses of instrumentality -- the ‘tool’ sense and the ‘technique’ sense -- do the majority of conceptual work in regard to current, mainstream thinking on the technology of war. Where the ‘tool’ sense falls short, the ‘technique’ sense can take up the slack. And vice versa. Is there any reason to think that modern technology presents us with any problems that can’t be grasped with one, or the other, sense of instrumentality?
Third, what does it mean to deny that modern technology is essentially instrumental? It means that we deny that modern technology can be understood (and its risks assessed) through an appeal to one or the other sense of instrumentality. In his influential essay “The Question Concerning Technology” the German philosopher Martin Heidegger pushes back against the instrumental definition of technology by appealing to the more primitive aspect of technē as a creative ‘poiein’ (doing, making) instead of a utilitarian ‘prattein’ (doing, using). More basic than the tool, which is lost without its human user; more basic than the technique, with which humans can manipulate various tools to various ends; more basic than both of these is the original thinking that takes into account the way in which techniques as such order and arrange how we see and engage with the world and everything in it. Heidegger has a special term for the way in which technology arranges and orders the world, which is, only with difficulty, translated as “enframing” [Ge-stell].
With this term he means to remind us that at its most basic level the human relationship to technology is not a relationship to this or that means to this or that end, but rather a creative collaboration, if you will, by which we participate in the determination and celebration of certain types of values, social, political, ethical and otherwise. In more concrete terms, what is most essential to the technology of drone warfare is not its utilitarian value for striking this or that target; or its technical value as one method among an ensemble of methods to be used for military purposes -- on the contrary, what is most essential to the technology of drone warfare is the novel way in which it “enframes” both enemy and ally, inserting a peculiar kind of distance between the two that is negotiated by yet further layers of technologically-mediated information.
(3) Technology “enframes” the world -- so what?
Is this all too obvious and trivial? Every technical perspective on the world “enframes” the world in a certain peculiar way. The farmer’s plow “enframes” the earth as to-be-tilled soil, the baker’s oven “enframes” the dough as to-be-baked, the centurion’s shield and spear “enframe” the enemy as to-be-warded-against and to-be-speared in close quarters. We might think of drones in a similar manner. However, modern technology presents us with a particularly pernicious style of enframing insofar as mechanical and digital advances allow the full extent and meaning of enframing to conceal itself. The drone operator, for example, orients himself or herself to the target in a manner that superficially resembles the manner in which a sniper orients himself or herself to a target. The drone operator takes a type of responsibility for the drone under his or her control similar to that which the sniper might take for the gun under his or her control. And yet sandwiched between the drone operator and the target are layers of technologically mediated information that the operator cannot take responsibility for. Some of this information is provided by the sensory apparatus of the drone itself and related networks: location coordinates, heat signatures, weather data, etc. And some of this information comes from human intelligence, which is subsequently processed and put at the disposal of the operator. The technology of drone warfare, as tool and technique, folds these separate strands of information together with the net effect of “distributing” the responsibility of execution across a range of actors not limited to the operator and the operator’s superior. The analogy between the farmer and his plow, the baker and his bread, even the centurion and his spear and shield breaks down. The farmer, or baker, or even the centurion, exhibits a peculiarly direct type of control over his or her tools and techniques; the drone operator less so.
The analogy of the assembly line worker who adds one part to a complex product may be more accurate insofar as it captures the individual’s mediated role in the act of production.
What’s critical to recognize is that this “distribution” of responsibility is fundamentally different from the normal “distribution” of responsibility that we are used to seeing in hierarchical systems of authority like the military’s chain of command, in which orders come from above and are executed below, making each superior and subordinate co-responsible in the fulfillment of objectives. On the contrary, the intervention of technologically automated processes in the decision-making circuit represents a new third element that reorients our understanding of the field of combat. It is incumbent on those who hold out the prospect of justice in warfare to recognize this reorientation, and to question whether or not the new field of battle, which we create, condone, and sustain through the elective use of certain technologies, accords with our moral and ethical sensibilities.
(4) Conclusion: The Dangers Therein
Undoubtedly the technology of drone warfare offers new possibilities by which the military might safeguard the people, interests, and values which it is charged with safeguarding. And yet alongside this great “saving power” also lies danger. It is not unheard of for a powerful technology of war, as tool or technique, to be ruled out of use on the basis of its potential for misuse. Chemical and biological weapons have been ruled out because the kind of battlefield that their use creates is judged to be discordant with the values of a just war. And, in a more pragmatic sense, they are judged to be practically dangerous -- the ‘safety’ of their safeguarding is utterly insufficient. Could drone warfare go the same way? I list four points below which we should examine in asking ourselves whether the world “enframed” by the technology of drone warfare is one that embodies our moral values.
(a) The question of space and spatiality. Automated technologies of war have no strategic central orientation in a spatial sense. Drones, for example, are not primarily used to protect the homeland in a defensive posture -- rather, they strike as an advance front. Note also the use of drones to subvert the spatial norms (boundaries) of sovereign nations. Does drone technology artificially enhance the attractiveness of initiating conflict beyond our national borders?
(b) The question of time and temporality. Automated technologies have no center of orientation in a temporal sense. Drones, for example, are not primarily used as a defensive response to an offensive attack (note the great reliance on 9/11 as a temporal zero point of an unfolding narrative that is increasingly unable to account for and justify subsequent military decisions in Afghanistan and Iraq and beyond). Note also how the use of drones accompanies the policy of preemption, which is itself a subversion of the “temporal sovereignty” of a sovereign nation (not yet) at war. Does drone technology artificially enhance the attractiveness of preemption?
(c) The question of ethical agency: is ‘granular responsibility’ responsibility at all? The “bottom up” question: does the soldier, who relies on technology to accomplish the objective, sacrifice his moral agency in greater or lesser degree relative to the number of layers mediating his action in war?
(d) The question of policy: does the increasing automation of warfare limit in advance the upper echelon’s ability to assess the moral and military efficacy of its methods and techniques? The “top down” question: do the layers of embedded technology implicit in war today, which strategy makers have only a fleeting understanding of, inhibit an adequately clear grasp of the ways in which that technology “enframes” the battlefield?