By: Thibault Moulin
Doctrinal Background
Could a soldier be considered a thing? At first sight, the idea might sound questionable, perhaps even shocking. Yet it is precisely one of the questions raised by human enhancement technologies, and it has begun to make its way into scholarship.
For instance, Chircop and Liivoja suggest that a pharmaceutically enhanced warfighter ‘does not become a mere instrument’ or a ‘weapon’, ‘as long as a warfighter possesses human agency [...] the ability to exercise free will’.[1] However, should this ‘human agency’ be ‘erased’, ‘turning the warfighter into a mere automaton or a remotely controlled biological avatar’, it would ‘be possible to speak of warfighters as weapons’.[2] They also consider that ‘a robotically enhanced warfighter will constitute a means of warfare as soon as the warfighter is, to any extent, integrated with a weapon’.[3] Consequently, ‘[w]here the BMI [brain-machine interface] does not use external sensors to record neural activity but relies on implanted electrodes, the integration of the warfighter with the weapon system is such that it is difficult to separate the two. In such circumstances, the warfighter becomes part of the weapon system, and thus a component of a means of warfare’.[4] However, the techniques used by BMIs are more diverse than the ‘recording’ of ‘neural activity’ through ‘external sensors’ (non-invasive) or ‘implanted electrodes’ (invasive): they also include partially invasive technologies, as well as neurostimulation. This variety of technologies and their potential military applications needs to be taken into account in determining whether a soldier might be considered a ‘means’ of warfare.
After noting that ‘autonomous robots are clearly regulatable weapons’, Lin, Mehlman and Abney tackle the situation of a cyborg, ‘part-human, part-machine’.[5] They observe, however, that ‘[i]f we want to say that robots are weapons but humans are not, then we would be challenged to identify the point on that spectrum at which the human becomes a robot or a weapon’.[6] They then suggest a ‘simpler solution’: ‘to say that humans are weapons’.[7] Harrison Dinniss and Kleffner consider ‘that the enhanced human soldier, per se, is not to be considered a weapon, because it is not the person that constitutes the offensive capability that can be applied to military objectives or enemy combatants’.[8] For the time being, they think that ‘a distinction between the human, on the one hand, and the enhancement technology, on the other, remains possible since the use of the technology does not convert the human into an object that could be considered a weapon’.[9] Noll affirms that, ‘[t]o the extent that neurotechnology informs future weapons systems, states are under an obligation to consider how the resulting weapons system would relate to IHL obligations’.[10] According to Beard, Galliott and Lynch, ‘[i]f a weapon is deployed in violation of international law, intuition suggests that the person wielding it will be held responsible’.[11] However, ‘this analogy may not extend to enhanced warfighters, who are simultaneously weapon and wielder’.[12] These scholars thus go one step further: they contemplate the possibility that a human being (a warfighter) could itself be considered a weapon.
Objective
This project proposes to revisit these questions from the perspective of article 36 of Additional Protocol I to the Geneva Conventions, with a specific focus on brain-machine interfaces. According to article 36, ‘[i]n the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party’.
Structure and Methods
In its introduction, this paper will identify the different types of BMIs and how they function. The concept underlying this technology is the following: electrodes are placed either on the scalp (non-invasive), inside the skull but on the surface of the brain (partially invasive), or implanted as arrays within the brain itself (invasive). The device then ‘translates neuronal information into commands capable of controlling external software or hardware such as a computer or robotic arm’.[13] The current state of their military applications will also be investigated, as well as the projects expected to be developed in the future. For instance, in September 2018, DARPA ‘confirmed that an individual equipped with an experimental brain-computer interface […] was able to successfully command and control multiple simulated jet aircraft’.[14] To do so, the research will draw on medical and bioethics reviews, official publications, and a sample of concrete clinical trials.
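To make the ‘translation’ step described above more concrete, the following minimal sketch shows, in purely conceptual terms, how recorded neural activity might be decoded into a discrete command for an external device. It is a toy illustration under stated assumptions (synthetic signal, made-up decoder weights and thresholds), not a depiction of any actual clinical or military BMI system discussed in this project.

```python
# Conceptual sketch only: a toy "translation" stage of a BMI pipeline,
# mapping recorded neural activity to a discrete command. All names,
# weights, thresholds and the synthetic signal are illustrative
# assumptions, not an actual system.

import numpy as np

# Synthetic stand-in for features extracted from electrode recordings
# (e.g. one band-power value per electrode channel).
rng = np.random.default_rng(seed=0)
channel_features = rng.normal(loc=0.0, scale=1.0, size=8)

# A trivial linear "decoder": in practice, weights would be learned per
# user during a calibration session.
decoder_weights = np.linspace(-1.0, 1.0, num=8)
decoded_value = float(channel_features @ decoder_weights)

# Map the decoded value onto discrete commands for an external device
# (e.g. a cursor or robotic arm) -- the step the quoted definition calls
# translating 'neuronal information into commands'.
if decoded_value > 0.5:
    command = "MOVE_RIGHT"
elif decoded_value < -0.5:
    command = "MOVE_LEFT"
else:
    command = "HOLD"

print(f"decoded value: {decoded_value:.2f} -> command: {command}")
```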
It is then necessary to study whether the enhancing technology or the enhanced soldier can be qualified as a ‘weapon’ (I), a ‘means’ (II), or a ‘method’ of warfare (III).
[1] Luke Chircop and Rain Liivoja, ‘Are Enhanced Warfighters Weapons, Means, or Methods of Warfare?’ (2018) 94 ILS 161, 177.
[2] Ibid.
[3] Ibid 180.
[4] Ibid.
[5] Patrick Lin, Maxwell Mehlman and Keith Abney, ‘Enhanced Warfighters: Risk, Ethics, and Policy’ (2013) 29.
[6] Ibid.
[7] Ibid 29-30.
[8] Heather Harrison Dinniss and Jann Kleffner, ‘Soldier 2.0: Military Human Enhancement and International Law’ (2016) 92 ILS 432, 438.
[9] Ibid.
[10] Gregor Noll, ‘Weaponising Neurotechnology: International Humanitarian Law and the Loss of Language’ (2014) 2(2) London Review of International Law 201, 211.
[11] Matthew Beard, Jai Galliott and Sandra Lynch, ‘Soldier Enhancement: Ethical Risks and Opportunities’ (2016) 13(1) Australian Army Journal 5, 15.
[12] Ibid.
[13] ‘Brain-Machine Interface’ (Nature) <www.nature.com/subjects/brain-machine-interface>
[14] William Kucinski, ‘DARPA subject controls multiple simulated aircraft with brain-computer interface’ (SAE, 12 September 2018) <www.sae.org/news/2018/09/darpa-subject-controls-multiple-simulated-aircr...
[15] Bundesministerium für Verteidigung, Prüfung neuer Waffen, Mittel und Methoden der Kriegführung (2016) 37, [301].
[16] Department of the Air Force, ‘The Law of War’ (03.08.2018) Air Force Instructions 51-401, 13 <https://static.e-publishing.af.mil/production/1/af_ja/publication/afi51-...
[17] DOD, ‘Dictionary of Military and Associated Terms’ (2018) 252 <www.jcs.mil/Portals/36/Documents/Doctrine/pubs/dictionary.pdf>
[18] Bundesministerium für Verteidigung, Prüfung neuer Waffen, Mittel und Methoden der Kriegführung (2016) 37, [301].
[19] ICRC, ‘A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977’ (2006) 932.
[20] Ministère de la Défense, Manuel du droit des conflits armés (Ministère de la Défense 2012) 64.
[21] MOD, The Joint Service Manual of the Law of Armed Conflict (DSDC(L) 2004) 82 [5.32.4].
[22] Commonwealth of Australia, Law of armed conflict (Defence Publishing Service 2006) 6-7.
[23] DOD, ‘Law of War Manual’ (2015) 185 <https://dod.defense.gov/Portals/1/Documents/pubs/DoD%20Law%20of%20War%20...
[24] ICRC, ‘A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977’ (2006) 932.
[25] Bundesministerium für Verteidigung, Prüfung neuer Waffen, Mittel und Methoden der Kriegführung (2016) 37, [301].
[26] William Boothby, New Technologies and the Law in War and Peace (CUP 2018) 384.
[27] ‘Statement on Lethal Autonomous Weapons Systems (LAWS)’ (2016) <www.unog.ch/80256EDD006B8954/(httpAssets)/5951D4CF7936ADE3C1257F9A004B62D6/$file/2016_LAWS_+MX_ChallengestoIHL_Statements_Israel.pdf>