Lethal autonomous weapon

Lethal autonomous weapons (LAWs) are a type of military robot designed to select and attack military targets (people, installations) without intervention by a human operator. LAWs are also called 'lethal autonomous weapon systems' (LAWS), 'lethal autonomous robots' (LAR), 'robotic weapons' or 'killer robots'. LAWs may operate in the air, on land, on water, under water, or in space. The autonomy of current systems (as of early 2016) is restricted in the sense that a human gives the final command to attack - though there are exceptions with certain 'defensive' systems (see below).

A BAE Raven during flight testing

Explanation

LAWs should not be confused with unmanned combat aerial vehicles (UCAVs) or 'combat drones', which are currently remote-controlled by a pilot: only some LAWs are combat drones. Even those combat drones that can fly autonomously do not currently fire autonomously - they have 'a human in the loop'.

Some current examples of LAWs are automated 'hardkill' active protection systems, such as the radar-guided guns used to defend ships since the 1970s (e.g. the US Phalanx CIWS). Such systems can autonomously identify and attack oncoming missiles, rockets, artillery fire, aircraft and surface vessels according to criteria set by the human operator. Similar systems exist for tanks, e.g. the Russian 'Drozd' (1977–82), its successor Arena, the Israeli Trophy, or the German 'AWiSS'/'AVePS' (Active Vehicle Protection System) by Diehl, with a reaction time below 400 ms. The main reason for not having a 'human in the loop' in these systems is sheer speed.

Systems with a higher degree of autonomy would include 'drones' or Unmanned combat aerial vehicles, e.g.: “The BAE Systems Taranis jet-propelled combat drone prototype can autonomously search, identify and locate enemies but can only engage with a target when authorized by mission command. It can also defend itself against enemy aircraft” (Heyns 2013, §45). The Northrop Grumman X-47B drone can take off and land on aircraft carriers (demonstrated in 2014); it is set to be developed into an ‘Unmanned Carrier-Launched Airborne Surveillance and Strike’ (UCLASS) system. It is now quite conceivable that an autonomous drone, perhaps with the size and appearance of a bird, could be commanded to locate (e.g. using cell phone signal), pursue and kill an individual person—rather like a ‘hit man’.

Future

While stationary systems and autonomous air systems are fairly advanced, autonomous mobile systems will soon also be available on land, on water and under water, as well as in space. The 2013–2038 'Unmanned Systems Integrated Road Map' of the US Department of Defense (US Department of Defense 2013) foresees increasing levels of autonomy in air, land and sea systems over the coming 25 years.

Ethical and Legal Issues

The possibility of LAWs has generated significant debate, especially about the risk of 'killer robots' roaming the earth, whether in the near or the far future. There is a strong Campaign to Stop Killer Robots and, in July 2015, over 1,000 experts in artificial intelligence signed a letter warning of the threat of an arms race in military artificial intelligence and calling for a ban on autonomous weapons.[1]

Current US policy states: "Autonomous … weapons systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."[2]

There is concern (e.g. Sharkey 2012) that LAWs would violate International Humanitarian Law, especially the principle of distinction, which requires the ability to discriminate combatants from non-combatants, and the principle of proportionality, which requires that damage to civilians be proportionate to the military aim. This concern is often invoked as a reason to ban 'killer robots' altogether - but it is doubtful that it can serve as an argument against LAWs that do not violate International Humanitarian Law.[3]

Other risks are that, just as with unmanned combat aerial vehicles, LAWs will make military action easier for some parties, and thus lead to more killings.

Finally, LAWs are said to blur the boundaries of who is responsible for a particular killing - but this, too, is doubtful; in fact they may make it easier to record who gave which command.[4]

References

  1. Gibbs, Samuel (27 July 2015). "Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons". The Guardian. Retrieved 28 July 2015.
  2. US Department of Defense (2012). "Directive 3000.09, Autonomy in weapon systems" (PDF). p. 2.
  3. Müller, Vincent C. (2016). "Autonomous killer robots are probably good news". Ashgate.
  4. Simpson, Thomas W.; Müller, Vincent C. (2016). "Just war and robots' killings". Philosophical Quarterly.
This article is issued from Wikipedia - version of Tuesday, April 26, 2016. The text is available under the Creative Commons Attribution/Share Alike licence, but additional terms may apply for the media files.