
Lethal autonomous weapon

Lethal autonomous weapons (LAWs) are a type of autonomous military system that can independently search for and engage targets based on programmed constraints and descriptions. LAWs are also known as lethal autonomous weapon systems (LAWS), autonomous weapon systems (AWS), robotic weapons or killer robots. LAWs may operate in the air, on land, on water, underwater, or in space. As of 2018, the autonomy of such systems was restricted in the sense that a human gives the final command to attack, though exceptions exist for certain "defensive" systems.

"Killer robot" redirects here. For the concept of robots and/or artificial intelligence killing or eradicating humans and/or other living beings, see Existential risk from artificial general intelligence, AI takeover, and grey goo.

Being autonomous as a weapon

Being "autonomous" has different meanings in different fields of study. In terms of military weapon development, the identification of a weapon as autonomous is not as clear as in other areas.[1] The specific standard entailed in the concept of being autonomous can vary hugely between different scholars, nations and organizations.


Various people have many definitions of what constitutes a lethal autonomous weapon. The official United States Department of Defense policy on autonomy in weapon systems defines an autonomous weapon system as "a weapon system that, once activated, can select and engage targets without further intervention by a human operator."[2] Heather Roff, a writer for Case Western Reserve University School of Law, describes autonomous weapon systems as "armed weapons systems, capable of learning and adapting their 'functioning in response to changing circumstances in the environment in which [they are] deployed,' as well as capable of making firing decisions on their own."[3] This definition sets a fairly high threshold compared with those of scholars such as Peter Asaro and Mark Gubrud, discussed below.


Scholars such as Peter Asaro and Mark Gubrud set the threshold lower and judge more weapon systems to be autonomous. They hold that any weapon system capable of releasing lethal force without the operation, decision, or confirmation of a human supervisor can be deemed autonomous. According to Gubrud, a weapon system operating partially or wholly without human intervention is considered autonomous. He argues that a weapon system does not need to be able to make decisions completely by itself in order to be called autonomous; it should be treated as autonomous as long as it is actively involved in one or more parts of the "preparation process", from finding the target to finally firing.[4][5]


Other organizations, however, set a higher standard for what counts as an autonomous weapon system. The British Ministry of Defence defines autonomous weapon systems as "systems that are capable of understanding higher level intent and direction. From this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state. It is capable of deciding a course of action, from a number of alternatives, without depending on human oversight and control - such human engagement with the system may still be present, though. While the overall activity of an autonomous unmanned aircraft will be predictable, individual actions may not be."[6]


As a result, the composition of a treaty between states requires a commonly accepted labeling of what exactly constitutes an autonomous weapon.[7]

Automatic defensive systems

The oldest automatically triggered lethal weapons are the land mine, used since at least the 1600s, and the naval mine, used since at least the 1700s. Anti-personnel mines are banned in many countries by the 1997 Ottawa Treaty, though not by the United States, Russia, and much of Asia and the Middle East.


Some current examples of LAWs are automated "hardkill" active protection systems, such as the radar-guided close-in weapon systems (CIWS) used to defend ships since the 1970s (e.g., the US Phalanx CIWS). Such systems can autonomously identify and attack oncoming missiles, rockets, artillery fire, aircraft and surface vessels according to criteria set by the human operator. Similar systems exist for tanks, such as the Russian Arena, the Israeli Trophy, and the German AMAP-ADS. Several types of stationary sentry guns, which can fire at humans and vehicles, are used in South Korea and Israel. Many missile defence systems, such as Iron Dome, also have autonomous targeting capabilities.


The main reason for not having a "human in the loop" in these systems is the need for rapid response. They have generally been used to protect personnel and installations against incoming projectiles.

Autonomous offensive systems

According to The Economist, as technology advances, future applications of unmanned undersea vehicles might include mine clearance, mine-laying, anti-submarine sensor networking in contested waters, patrolling with active sonar, resupplying manned submarines, and becoming low-cost missile platforms.[8] In 2018, the U.S. Nuclear Posture Review alleged that Russia was developing a "new intercontinental, nuclear-armed, nuclear-powered, undersea autonomous torpedo" named "Status 6".[9]


The Russian Federation is actively developing artificially intelligent missiles,[10] drones,[11] unmanned vehicles, military robots and medic robots.[12][13][14][15]


Israeli Minister Ayoob Kara stated in 2017 that Israel is developing military robots, including ones as small as flies.[16]


In October 2018, Zeng Yi, a senior executive at the Chinese defense firm Norinco, gave a speech in which he said that "In future battlegrounds, there will be no people fighting", and that the use of lethal autonomous weapons in warfare is "inevitable".[17] In 2019, US Defense Secretary Mark Esper criticized China for selling drones capable of taking life with no human oversight.[18]


The British Army deployed new unmanned vehicles and military robots in 2019.[19]


The US Navy is developing "ghost" fleets of unmanned ships.[20]


In 2020 a Kargu 2 drone hunted down and attacked a human target in Libya, according to a report from the UN Security Council's Panel of Experts on Libya, published in March 2021. This may have been the first time an autonomous killer robot armed with lethal weaponry attacked human beings.[21][22]


In May 2021 Israel conducted an AI guided combat drone swarm attack in Gaza.[23]


Since then there have been numerous reports of swarms and other autonomous weapons systems being used on battlefields around the world.[24]


In addition, DARPA is working on making swarms of 250 autonomous lethal drones available to the American military.[25]

Ethical and legal issues

Degree of human control

Three classifications of the degree of human control of autonomous weapon systems were laid out by Bonnie Docherty in a 2012 Human Rights Watch report.[26]

See also

Artificial intelligence arms race

List of fictional military robots

Slaughterbots

Further reading

The Guardian (2023), video: ‘How killer robots are changing modern warfare’, 24 February 2023, Josh Toussaint-Strauss, Ali Assaf, Joseph Pierce, Ryan Baxter.


Heyns, Christof (2013), ‘Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions’, UN General Assembly, Human Rights Council, 23 (3), A/HRC/23/47.

Krishnan, Armin (2009), Killer robots: Legality and ethicality of autonomous weapons (Aldershot: Ashgate).

Müller, Vincent C. (2016), ‘Autonomous killer robots are probably good news’, in Ezio Di Nucci and Filippo Santoni de Sio (eds.), Drones and responsibility: Legal, philosophical and socio-technical perspectives on the use of remotely controlled weapons, 67-81 (London: Ashgate).

Saxon, Dan (2022). Fighting Machines: Autonomous Weapons and Human Dignity. University of Pennsylvania Press. ISBN 978-0-8122-9818-5.

Sharkey, Noel E (2012), ‘Automating Warfare: lessons learned from the drones’, Journal of Law, Information & Science, 21 (2).

Simpson, Thomas W and Müller, Vincent C. (2016), ‘Just war and robots’ killings’, The Philosophical Quarterly 66 (263), 302–22.

Singer, Peter (2009), Wired for war: The robotics revolution and conflict in the 21st Century (New York: Penguin)

US Department of Defense (2012), ‘Directive 3000.09, Autonomy in weapon systems’. <2014 Killer Robots Policy Paper Final.docx>.

US Department of Defense (2013), ‘Unmanned Systems Integrated Road Map FY2013-2038’.


The Ethics of Autonomous Weapons Systems (2014), seminar at UPenn. Archived 2020-10-25 at the Wayback Machine.