Commentary

Who Is Responsible for Lethal Autonomous Weapon Systems?

Requiring human responsibility for LAWS ensures countries retain these weapons’ advantage of reducing non-combatant fatalities while minimizing violations of international humanitarian law.

From The Terminator to HAL, artificially intelligent “killer robots” have frightened and exhilarated audiences for years.

Recently, countries have turned this fiction into a reality through the development of lethal autonomous weapon systems (LAWS). One prominent example is the American Phalanx CIWS, a radar-guided 20 mm Vulcan cannon that automatically attacks threats approaching combat ships at sea.

As more countries develop intelligent weapons capable of mass destruction, it is imperative to create and enforce regulations governing these LAWS.

Human Responsibility

While LAWS currently function under strictly defined parameters in their programming, the steady growth of artificial intelligence capabilities will lead to more powerful LAWS that can make decisions, learn from them, and potentially adjust their own code to react in new ways.

LAWS can identify, select, and engage targets without human intervention; they have the power to make life-and-death determinations.

To ensure LAWS comply with international humanitarian law, humans, specifically the commanders who deploy these weapons and the programmers who create them, should be required to accept responsibility for LAWS at all times.

By requiring humans to accept responsibility instead of banning LAWS entirely, countries retain these weapons’ advantage over human fighters in reducing civilian casualties and property damage.

Ethical Programming

Humans have historically proven poor at following ethical guidelines on the battlefield due to high stress, unclear orders, youthful troops, and other factors. Given these pressures, expecting uncompromising adherence to international humanitarian law is unreasonable and unattainable.

In contrast, LAWS’ unemotional responses, capacity for self-sacrifice, and superior data processing could significantly reduce non-combatant deaths. Engineers can program LAWS to act in ways humans deem most ethical.

Minimizing the atrocities of war, such as civilian deaths, is one of the highest priorities. Although LAWS demand thorough consideration and research before deployment, their ability to reduce these atrocities is invaluable.

Accountability

Admittedly, some consider existing legal mechanisms unsuitable for addressing LAWS’ infractions, making it impossible to hold any party accountable. They argue that LAWS themselves cannot be held accountable because there is no intentionality behind their decisions.

They also claim that human commanders and programmers cannot be held accountable because of the machines’ fully autonomous nature; under this view, commanders and programmers would be liable only if they had specifically intended to commit criminal acts by misusing or tampering with the machine.

However, this view fails to account for the specific purpose of LAWS: to harm. Programmers choose to design these weapons and commanders choose to deploy them, and that decision to build and use them in warfare makes them liable. When a LAWS malfunctions, the commander, the programmer, and the weapon system are all agents and share accountability.

Although commander and programmer accountability may seem unfair in some situations, this high level of responsibility will incentivize the use of LAWS only in specific, thoroughly researched circumstances.


Penelope Shaw is a second-year undergraduate at Lehigh University studying computer science and integrated business and engineering. She is interested in the ethics of computing and artificial intelligence.


The views and opinions expressed here are those of the author and do not necessarily reflect the editorial position of The Defense Post.
