The Roundtable
Welcome to the Roundtable, a forum for incisive commentary and analysis
on cases and developments in law and the legal system.
By Ishita Chakrabarty

Ishita Chakrabarty is a guest writer for the Penn Undergraduate Law Journal’s Roundtable.

The first cross-boundary drone strike (targeting specific individuals and facilities in one state through remote systems operated by persons in another state) was conducted by the US in Afghanistan in 2001. Since then, drones have been deployed in direct combat operations, despite public outcry over the “imprecise nature” of their targeting. An independent report published in 2017 noted that US-led coalition strikes had killed approximately 4,000 civilians, while the US Central Command claimed that civilian casualties numbered no more than 484 [1]. Another report indicates that, in one case, US-led drone strikes killed 150 civilians after repeatedly bombing a school in Syria [2].

In 2013, the UN Special Rapporteur called for a halt to the development of “killer robots” (weapons triggered automatically by the target, without the user’s intervention), which he said gave no consideration to human dignity: robots should never have the final say over matters of life and death [3]. The movement gained traction again in August 2018, when 26 countries explicitly called for a ban on lethal autonomous weapons, and more than 70 countries met at the UN to discuss the challenges involved in outlawing them [4].

The use of autonomous weapons violates three fundamental principles of jus in bello (the laws of war): distinction, proportionality, and precaution. The first mandates that attacks always be directed against “military objectives” and never against “civilians and civilian objects.” The second mandates that the intensity of an attack be proportionate to the military advantage sought over the other party. The third mandates that those undertaking an attack do everything feasible to verify that the object is a military objective, and that the attack be aborted when the object’s status appears uncertain [5].
One report noted that strikes by autonomous weapons often have to be carried out several times before the specific individual is hit, resulting in more casualties than states usually claim and rebutting assertions of “preciseness” [6]. During a conference on intelligent defense systems, the same question about the legality of attacks arose in the context of the Israeli “Iron Dome” (a mobile missile defense system) and the way its use ignores the military-civilian dichotomy. Another sentry system that Israel employs along its border with Gaza, “Roeh-Yoreh,” likewise comprises prepositioned weapons that see and shoot [7].

There have also been instances of “signature strikes,” in which individuals bearing specific characteristics associated with terrorist activities have been targeted through autonomous systems without their identities actually being known [8]. The US, for instance, looks at whether a person has previously been seen “giving out orders,” possesses arms, or exhibits characteristics that it believes indicate affiliation with a terrorist group [9]. Since these systems are tasked with both selecting and firing on targets, it is difficult for their users to determine the particular target at the exact moment fire is launched.

Autonomous weapons operate within fixed parameters: their sensor data is processed by algorithms that detect, track, and classify objects by tallying the data against existing databases. Any change within these fixed environments therefore leads to unpredictability in their functioning. Consider, for example, that International Humanitarian Law (IHL), comprising primarily the Geneva and Hague Conventions, along with customary law followed by states and subsequent treaties, specifically provides that civilians and those rendered hors de combat (those who, because of injury or by choice, no longer participate in the conflict) are to be exempted from attack at all times. This is part of customary international law, a de minimis guarantee (the minimal protection states are under an obligation to provide) offered under Common Article 3 of the Geneva Conventions. Since an attack can no longer be aborted through human intervention once launched, autonomous weapons fail to take into account behavioral cues, such as the exact moment a person decides to give up their belligerent status. This contravenes the standards of constant care and human diligence required under IHL. Peter Margulies, an expert in national security law and a professor at the Roger Williams University School of Law, terms this an exercise of “dynamic diligence,” an obligation under the precautionary principle of jus in bello [10].

IHL prohibits the use of certain landmines, booby-traps, and similar munitions in warfare because they cause unnecessary suffering and superfluous injury. The thread common to all these prohibited weapons is their functioning: they are set off by the targets rather than by the users, providing no option to terminate the attack. Autonomous weapons function in exactly the same way. Should they not be prohibited by the same reasoning?
Another point raised against the employment of autonomous weapons is that, given the development of complex technology and the asymmetrical nature of modern warfare (tilted in favor of technologically advanced states), one party to a conflict can conduct cyber-attacks targeting the sensor systems within the autonomous weapons of the other party. Similarly, technical malfunctions within the sensor systems can skew a party’s targeting decisions. In the absence of any mission-control software to detect such aberrations, catastrophic attacks can be launched inadvertently.

In its Advisory Opinion on the Legal Consequences of the Construction of a Wall in the Occupied Palestinian Territory, the International Court of Justice (ICJ) opined that occupying states in the context of an international armed conflict, as well as states within which internal armed conflicts take place, are under an obligation not to digress from fundamental principles of human rights in armed conflict situations. States therefore cannot arbitrarily take away the lives of individuals [11]. States must fulfill these obligations by abstaining from employing such weapons, and even in targeting decisions they must undertake a two-step verification, as the Human Rights Committee has previously noted: first, by using data recovered from the remote systems, and second, through an on-ground direct verification process [12]. In fact, whenever possible, conventional methods of warfare should be preferred over autonomous weapons, even weapons that are not fully autonomous.

To conclude, autonomous weapons as they are now commonly used are unlikely to take into account the nuances and rules of IHL. In addition, there is no assurance that specific individuals, such as terrorists, will be eliminated, because they often relocate to safer havens while civilians bear the brunt of the attacks. Since states cannot act illegally even in situations of armed conflict (and as a corollary must observe fundamental human rights principles), they should refrain from deploying and further developing lethal autonomous weapons systems.

References:

[1] Daniel R. Mahanty, Don’t Loosen the Rules on Civilian Casualties During Drone Strikes (Nov. 19, 8:00 pm), https://www.defenseone.com/ideas/2017/06/dont-loosen-rules-civilian-casualties-during-drone-strikes/138933/.

[2] Margaret Sullivan, Middle East Civilian Deaths Have Soared Under Trump. And the Media Mostly Shrug., The Washington Post, Mar. 18, 2018, https://www.washingtonpost.com/lifestyle/style/middle-east-civilian-deaths-have-soared-under-trump-and-the-media-mostly-shrug/2018/03/16/fc344968-2932-11e8-874b-d517e912f125_story.html?utm_term=.de10e512604a.

[3] Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, paras. 118-126 (Nov. 13, 2018, 11:30 am), https://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

[4] Use of ‘Killer Robots’ in Wars Would Breach Law, Say Campaigners, The Guardian, Aug. 21, 2018, https://www.theguardian.com/science/2018/aug/21/use-of-killer-robots-in-wars-would-breach-law-say-campaigners.

[5] Protocol Additional to the Geneva Conventions of 12 August 1949 and Relating to the Protection of Victims of International Armed Conflicts (adopted 8 June 1977) 1125 UNTS 3 (Protocol I), arts. 48, 51, 52, 57, 58.

[6] 41 Men Targeted but 1,147 People Killed: US Drone Strikes - the Facts on the Ground, The Guardian, Nov. 24, 2014, https://www.theguardian.com/us-news/2014/nov/24/-sp-us-drone-strikes-kill-1147.

[7] Michael Mertes, What Are Military Goals? (Nov. 13, 2018, 2:30 pm), https://www.kas.de/veranstaltungsberichte/detail/-/content/was-sind-militaerische-ziele-.

[8] Kevin Jon Heller, ‘One Hell of a Killing Machine’: Signature Strikes and International Law, 11 Journal of International Criminal Justice (2013).

[9] US to Continue ‘Signature Strikes’ on People Suspected of Terrorist Links, The Guardian, July 1, 2016, https://www.theguardian.com/us-news/2016/jul/01/obama-continue-signature-strikes-drones-civilian-deaths.

[10] Peter Margulies, Making Autonomous Weapons Accountable: Command Responsibility for Computer-Guided Lethal Force in Armed Conflicts (Roger Williams Univ. Legal Studies Paper No. 166, 2016), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2734900&download=yes.

[11] Legal Consequences of the Construction of a Wall in the Occupied Palestinian Territory, Advisory Opinion, July 9, 2004, https://www.icj-cij.org/files/case-related/131/131-20040709-ADV-01-00-EN.pdf.

[12] International Humanitarian Law and the Changing Technology of War 89 (Dan Saxon ed., Brill Nijhoff 2013).

Photo Credit: Wikimedia Commons, USAF Photographic Archives, https://commons.wikimedia.org/wiki/File:MQ-9_Reaper_UAV.jpg.

The opinions and views expressed through this publication are the opinions of the designated authors and do not reflect the opinions or views of the Penn Undergraduate Law Journal, our staff, or our clients.