The danger of autonomous drones is real. According to a United Nations Security Council report, a military drone may have attacked humans autonomously for the first time last year in Libya. Details of the operation and casualties have not been disclosed, but the report confirms that these lethal autonomous weapons are already being used on the battlefield despite international efforts to ban them.
A Turkish-made STM Kargu-2 drone was reportedly used to “remotely chase and engage” retreating soldiers, the report describes. The use of these drones in the Libyan conflict may open a new chapter for autonomous weapons: tools whose AI is programmed to decide on its own to eliminate targets, including humans.
This is the Kargu-2, an autonomous drone ready to kill
In June 2020, the Turkish army purchased nearly 500 Kargu drones from Defense Technologies Engineering and Trade Inc., better known as STM, a Turkish arms company. The first generation of these drones was introduced in 2017, and in 2019 the Kargu-2 variant was unveiled, capable of carrying out swarm-mode attacks and operating autonomously.
Kargu can be translated from Turkish as “hawk.” The company explains that the drones are designed for “asymmetric warfare and counter-terrorism.” Weighing about 7 kg, the drone can stay in the air for at least 30 minutes and fly at roughly 145 km/h, figures improved in the second generation.
The report of the United Nations Security Council describes the event as follows:
“Logistics convoys and retreating Haftar-affiliated forces were subsequently hunted down and remotely engaged by unmanned combat aerial vehicles or lethal autonomous weapons systems such as the STM Kargu-2 and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability. The unmanned combat aerial vehicles and the small drone intelligence, surveillance and reconnaissance capability of Haftar-affiliated forces were neutralized by electronic jamming from the Koral electronic warfare system.”
Evidence found by the United Nations indicates that the STM Kargu-2 was used and is operational in Libya. This is the first time its use has been detected, and its deployment is a breach of paragraph 9 of resolution 1970 (2011).
Kargu-series drones can be operated manually but are also prepared to operate autonomously, thanks to sensors ranging from electro-optical and infrared video cameras to a laser imaging system (lidar).
Through the KERKES program, STM prepared its drones to coordinate autonomously in a swarm and navigate via GPS. In swarms of up to 20 drones, the Kargu-2 is programmed with an AI capable of selecting and attacking targets.
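The details of STM's swarm software are not public, but the behavior described above (many drones coordinating toward a shared GPS waypoint while keeping their distance from one another) can be illustrated with a toy simulation. Everything below is an assumption for illustration; it is not the KERKES algorithm.

```python
import math

# Purely illustrative sketch: drones converge on a shared waypoint while
# nudging apart from nearby neighbors. NOT STM's actual (non-public) system.

def step(positions, waypoint, speed=1.0, min_sep=2.0):
    """Move each drone one step toward the waypoint, pushing apart any
    pair closer than min_sep (a crude collision-avoidance rule)."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        # Unit vector toward the shared waypoint, scaled by speed
        dx, dy = waypoint[0] - x, waypoint[1] - y
        dist = math.hypot(dx, dy) or 1e-9
        vx, vy = speed * dx / dist, speed * dy / dist
        # Separation: push away from any neighbor closer than min_sep
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            sx, sy = x - ox, y - oy
            d = math.hypot(sx, sy)
            if 0 < d < min_sep:
                vx += sx / d
                vy += sy / d
        new_positions.append((x + vx, y + vy))
    return new_positions

positions = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
waypoint = (50.0, 50.0)
for _ in range(100):
    positions = step(positions, waypoint)
# After enough steps, all drones hover in the vicinity of the waypoint.
```

Real swarm navigation adds sensor fusion, wind, and communications constraints, but the core idea, a shared goal plus local separation rules, is the same.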
So far, no victims of these autonomous weapons have been confirmed (the Libya incident would be the first), but STM's demonstration videos show the drones attacking a group of mannequins.
How an autonomous drone decides whom to attack
Anti-personnel mines can be configured with adjusted sensitivity so that they detonate only under the weight of an adult. With autonomous drones, there is still much debate about what specific parameters determine when they attack.
These drones' machine learning is trained on large datasets that allow them to differentiate objects, including vehicles such as tanks, buses, or vans. Presumably, armies can train a drone's AI to focus on the particularities of the enemy, giving it access to databases with images and information about those they wish to take down.
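The classification idea described above can be sketched in miniature. The example below is a deliberately simple nearest-centroid classifier over made-up feature vectors; real systems use deep neural networks over imagery, and the class names and numbers here are illustrative stand-ins, not anything from the Kargu-2.

```python
import math

# Toy "training data": invented 2-D feature vectors per class
# (stand-ins for whatever features a real vision model would learn).
TRAINING = {
    "tank": [(9.0, 8.5), (8.5, 9.0)],
    "bus":  [(7.0, 3.0), (7.5, 2.5)],
    "van":  [(4.0, 2.0), (3.5, 2.5)],
}

def centroids(data):
    """Average the training examples of each class into one centroid."""
    out = {}
    for label, points in data.items():
        n = len(points)
        out[label] = tuple(sum(coords) / n for coords in zip(*points))
    return out

def classify(features, cents):
    """Assign the observation to the class with the nearest centroid."""
    return min(cents, key=lambda lbl: math.dist(features, cents[lbl]))

cents = centroids(TRAINING)
print(classify((8.8, 8.7), cents))  # a tank-like observation → "tank"
```

Note how the verdict depends entirely on the training data: a skewed or incomplete dataset shifts the centroids and changes who gets labeled what, which is exactly the bias-and-failure risk discussed next.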
However, just as we have seen that algorithms carry many biases and are not free of failures, these autonomous drones can also be wrong, with fatal consequences.
Experts and agencies call for banning these autonomous weapons before it is too late
The European Union has warned against these killer robots, and the United States and Russia are also aware of the implications of these LAWs (Lethal Autonomous Weapons). However, both powers, supported mainly by South Korea, Israel, and Australia, have blocked negotiations to ban them.
Since the end of 2013, the United Nations has sought to ban them, but there has been no significant progress. During this time, figures such as Brad Smith, President of Microsoft, and Elon Musk, CEO of Tesla, along with a total of 116 experts from 26 countries, have called for a ban on the development and use of autonomous killer robots, because “once this Pandora's box is opened, it will be difficult to close.”
In Xataka | China warns: danger of accidental war over “smart weapons” is real and growing