A Military Drone With A Mind Of Its Own Was Used In Combat, U.N. Says

A Kargu rotary-wing attack drone loitering munition system manufactured by the STM defense company of Turkey. A U.N. report says the weapons system was used in Libya in March 2020.
Emre Cavdar / STM

Military-grade autonomous drones can fly themselves to a specific location, pick their own targets and kill without the assistance of a remote human operator. Such weapons are known to be in development, but until recently there were no reported cases of autonomous drones killing fighters on the battlefield.

Now, a United Nations report about a March 2020 skirmish in the military conflict in Libya says such a drone, known as a lethal autonomous weapons system — or LAWS — has made its wartime debut. But the report does not say explicitly that the LAWS killed anyone.

"If anyone was killed in an autonomous attack, it would likely represent an historic first known case of artificial intelligence-based autonomous weapons being used to kill," Zachary Kallenborn wrote in Bulletin of the Atomic Scientists.

The assault came during fighting between the U.N.-recognized Government of National Accord and forces aligned with Gen. Khalifa Haftar, according to the report by the U.N. Panel of Experts on Libya.

"Logistics convoys and retreating [Haftar-affiliated forces] were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 ... and other loitering munitions," the panel wrote.

The Kargu-2 is an attack drone made by the Turkish company STM that can be operated both autonomously and manually and that purports to use "machine learning" and "real-time image processing" against its targets.

The U.N. report goes on: "The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true 'fire, forget and find' capability."

"Fire, forget and find" refers to a weapon that once fired can guide itself to its target.

The idea of a "killer robot" has moved from fantasy to reality

Drone warfare itself is not new. For years, military forces and rebel groups have used remote-controlled aircraft to carry out reconnaissance, target infrastructure and attack people. The U.S. in particular has used drones extensively to kill militants and destroy physical targets.

Azerbaijan used armed drones to gain a major advantage over Armenia in recent fighting for control of the Nagorno-Karabakh region. Just last month, the Israel Defense Forces reportedly used drones to drop tear gas on protesters in the occupied West Bank, while Hamas launched loitering munitions — so-called kamikaze drones — into Israel.

What's new about the incident in Libya, if confirmed, is that the drone used had the capacity to operate autonomously, with no human controlling it: essentially a "killer robot," formerly the stuff of science fiction.

Not all in the world of security are concerned.

"I must admit, I am still unclear on why this is the news that has gotten so much traction," Ulrike Franke, a senior policy fellow at the European Council on Foreign Relations, wrote on Twitter.

Franke noted that loitering munitions have been used in combat for "a while" and questioned whether the autonomous weapon used in Libya actually caused any casualties.

Jack McDonald, a lecturer in war studies at King's College London, noted that the U.N. report did not make clear whether the Kargu-2 was operating autonomously or manually at the time of the attack.

While this incident may or may not represent the first battlefield killing by an autonomous drone, the idea of such a weapon is disquieting to many.

A global survey commissioned by the Campaign to Stop Killer Robots last year found that a majority of respondents — 62% — said they opposed the use of lethal autonomous weapons systems.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Joe Hernandez