Touch, Decision, Vision – Practices of Remote Control in Warfare

  • I co-organized the panel "Touch, Decision, Vision – Practices of Remote Control in Warfare" together with Kathrin Friedrich and Nina Franz
  • International conference "Control", 10th European Society for Literature, Science and the Arts Conference, Stockholm, 14–17 June 2016

Abstract of the panel: Surgical interventions and military operations have become significantly dependent on the technological separation of bodies. Robotic surgery and drone warfare show that human actors increasingly rely on remote interaction. While the practice of controlling distance through technology has different motivations, such as touchless interaction to maintain sterility or remotely controlled weapons to reduce vulnerability, the underlying question in medicine and warfare is the same: if action and perception are mediated by technical devices, to what extent do the technologies of remote control shape, impact and govern medical and military decisions and actions? The presentations in this panel explore three specific modalities of remote control, namely touch, decision and vision, in military and surgical contexts. Drawing on historical examples as well as on current problems and challenges of remote control, the papers investigate its fundamental principles of interaction and intervention.

My talk focused on the question of vision and on who controls senses and sensors in remote warfare.

Abstract of my talk: Image-guided military operations embed soldiers into a complex system of image production, transmission and perception that on the one hand separates their bodies from the battlefield and on the other mediates between them. Especially in remotely controlled operations of so-called unmanned aerial systems (UAS), navigation and intervention require the real-time synchronization of human and technical actors, as knowledge of a situation, the situational awareness, relies almost exclusively on the visualization of sensor data. This corresponds to a new operative modality of images that differs from previous forms of real-time imaging such as live broadcasting, as it is based on a feedback loop that turns the beholder into an actor. Images are not simply analysed and interpreted but become agents in a socio-technological assemblage. The paper draws on this functional shift of images from a medium of visibility and visualization towards a medium that controls and is used to control operative processes. Based on an analysis of vision, architecture and navigation in remote warfare, it discusses how real-time video technology and the mobilization of sensor and transmission technology have produced a type of intervention in which interaction is increasingly organized and determined by imaging technology. On the basis of a case study of UAS sensor operators, the paper shows that action and perception in remote warfare rely fundamentally on a form of iconic practice in which sensors can hardly be described as a prosthesis or an extension of the eye that is simply used by an operator. This contradicts the popular view that UAS crews possess an omnipresent and pervasive technical eye that kills with surgical precision from thousands of kilometres away (Brennan, 2012).
Instead, UAS crews compare visuality in remote operations to looking through a soda straw (Cullen, 2011). As a hybrid form of iconic practice, the concept of vision and visualization in remote-controlled warfare suggests a shift in conceptions of agency and autonomy and may be more accurately described as a convergence between soldier and weapon (Suchman, 2015). By analysing sensor operators' applied iconic knowledge, the paper investigates how the production of visibility in UAS operations strategically dissolves the borders between senses and sensors, in order to show how this affects actions and decisions in remote-controlled warfare.