Which armies have combat robots? An overview!

What is a combat robot? Who uses them? Under what conditions?

Recently, several media outlets reported that China has replaced its soldiers with combat robots in Tibet, on the border with India.

In the Chinese People’s Liberation Army, there are more and more forms of deployment in which soldiers are replaced by machines. Experts estimate that in the near future up to 50 percent of combat roles could be taken over by such “killer robots.” In Tibet, it was the altitude of the mountains that made this step necessary: the thin, icy air takes a heavy toll on soldiers. Modern warfare increasingly plans for the use of so-called “combat robots.” But what exactly is that?

What is a combat robot and what types are there?

First, a definition: people often speak of “military robots,” but this term is too narrow, because it confines the use of combat robots to the military. The term “military,” however, is subject to clear definitions that are anchored in the international law of war. Combat robots can also be used by so-called “non-combatants,” i.e., irregular armed groups, terrorists, criminals, civilian groups and so on. It therefore makes no sense to restrict the concept of autonomous weapon systems to certain user groups.

This brings us to an essential distinguishing feature of combat robots: they are autonomous weapon systems. They can produce a weapon effect that kills or injures humans, and they do so essentially autonomously.

This is an essential distinction, especially with regard to weapon effect, or weapon capability. There are quite a few robots, even within the military, that are not designed to kill or injure anyone. That is not insignificant: in some countries the use of combat robots with lethal effect is controversial, while the use of unarmed robotic assistants is usually not a problem.

The second important criterion is the degree of autonomy. Strictly speaking, weapon systems that kill humans automatically have existed for centuries: booby traps, mine barriers, and biological and chemical weapons. They are used to produce a lasting lethal effect, at a distance or over time, without any further action by a soldier. Even today, the regions of the Balkan wars are littered with poorly mapped minefields.

Similarly, the concept of combat robots must be distinguished from weapon systems that function merely by remote control. These remotely controlled weapon or observation systems are often lumped into the “robot” category, but this type of warfare has also existed for many years. The German “Goliath” miniature tank from World War II was one such weapon system: it could be guided by radio or cable into enemy positions and detonated there. By that logic, every missile with targeting radar would also be a robot. A human-controlled drone is certainly a technical advancement, but it still follows this principle.

AI weapon systems

Therefore, the term “combat robot” should include only those weapon systems that have a lethal effect and such a high degree of autonomy that the decision to use the weapon (i.e., to kill people) is made by the system itself. The term commonly used in this context is AI, artificial intelligence. Whether AI is really independent and intelligent in itself need not be discussed here. Let’s look more closely at what distinguishes these weapon systems from others. One feature is active friend-or-foe detection: if a system lacks it, it is no different from a conventional mine barrier, where every soldier who steps on a mine is killed without exception, regardless of which nation or army he belongs to.

Similarly, it makes a significant difference whether human intellect and will ultimately make the decision to trigger the weapon. Within NATO, and especially in the various developments toward a Future Combat Air System (FCAS), this is a moral line that the Western community of values does not want to cross.
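
To make the distinction between these degrees of autonomy more tangible, here is a minimal, purely conceptual Python sketch. The autonomy levels, the “EngagementRequest” record and the “human_confirms” callback are assumptions introduced only for illustration and do not describe any real system; the point is simply where the human check sits in the decision chain.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable


class AutonomyLevel(Enum):
    """Illustrative (assumed) taxonomy of weapon-system autonomy."""
    REMOTE_CONTROLLED = auto()      # e.g. the WWII "Goliath": a human steers and triggers
    AUTOMATIC = auto()              # e.g. a mine: reacts blindly, no target discrimination
    SUPERVISED_AUTONOMOUS = auto()  # system proposes, a human must approve ("in the loop")
    FULLY_AUTONOMOUS = auto()       # the system decides on its own: the "combat robot" case


@dataclass
class EngagementRequest:
    """Hypothetical record of what the system believes it has detected."""
    target_id: str
    classified_as_foe: bool
    confidence: float  # 0.0 .. 1.0


def decision_gate(level: AutonomyLevel,
                  request: EngagementRequest,
                  human_confirms: Callable[[EngagementRequest], bool]) -> bool:
    """Return True only if engagement would be authorized at the given autonomy level."""
    if level is AutonomyLevel.AUTOMATIC:
        # A mine-like system does not discriminate at all; it simply reacts to a trigger.
        return True
    if not request.classified_as_foe:
        return False
    if level in (AutonomyLevel.REMOTE_CONTROLLED, AutonomyLevel.SUPERVISED_AUTONOMOUS):
        # A human mind and will remain the deciding instance ("somebody", not "something").
        return human_confirms(request)
    # FULLY_AUTONOMOUS: the decision to use the weapon is made by the system itself.
    return request.confidence > 0.9
```

In this picture, the moral line discussed above is whether the last branch, the one without any human check, should be allowed to exist at all.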

Wolfgang Koch, one of the leading German scholars who has studied the ethical use of autonomous weapon systems, describes it this way: “The concepts of mind and will and, therefore, of consciousness and responsibility bring natural beings into view that are ‘somebody’ and not ‘something’, i.e., persons. Cognitive and volitive assistance systems, on the other hand, no matter the degree of technical sophistication they attain, are and will always be ‘something.’ It seems important to stress the dichotomy ‘something vs. somebody’ to counter both the exaggerated expectations and excessive fears that seem rampant in the public concern about automation in the military.”

It follows that these new, supposedly “artificial intelligence”-based weapon systems cannot simply be referred to as combat robots. In principle, they are nothing more than the technically sophisticated continuation of semi-autonomous helpers that support humans in achieving a weapon effect quickly and everywhere. But, and this is the crucial difference, these systems remain far removed from an autonomous decision about life and death. It should nevertheless be mentioned that such systems could be turned into fully autonomous combat robots in the future with relatively minor technical modifications.

“Real” combat robots

However, these types of combat robots already exist today, even in the Western Hemisphere. The intelligent weapon systems that really make a difference are the so-called lethal autonomous weapon systems (LAWS). They are considered the “third revolution in warfare,” although one can hardly yet speak of a revolution. One of these fully autonomous weapon systems is the STM Kargu-2, which is produced in Turkey.

The core competence of these systems is “Identification Friend or Foe” (IFF). Every army in the world has an interest in distinguishing its own soldiers from the enemy on an opaque battlefield. From the battle standards of early wars, to soldiers’ uniforms with their clear insignia of rank and nation, to modern technological developments, warring parties have always been concerned with avoiding death by “friendly fire” wherever possible. A good overview of the various IFF categories is provided by BAE Systems.
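
To illustrate the basic idea behind modern, cryptographic IFF, here is a minimal Python sketch of a challenge-response exchange. Real systems (such as NATO Mode 4/5 transponders) use dedicated waveforms, timing windows and certified cryptography; the shared key, function names and message format below are assumptions made purely for illustration.

```python
import hashlib
import hmac
import os

# A shared secret distributed to friendly platforms (purely illustrative value).
SHARED_KEY = b"example-coalition-key-of-the-day"


def interrogate() -> bytes:
    """Interrogator side: emit a fresh random challenge."""
    return os.urandom(16)


def respond(challenge: bytes, key: bytes) -> bytes:
    """Transponder side: answer the challenge with a keyed MAC.

    Only a platform holding the current key can compute a valid response.
    """
    return hmac.new(key, challenge, hashlib.sha256).digest()


def is_friend(challenge: bytes, response: bytes, key: bytes) -> bool:
    """Interrogator side: a correct response identifies the platform as friendly."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


if __name__ == "__main__":
    challenge = interrogate()
    answer = respond(challenge, SHARED_KEY)
    print("friend" if is_friend(challenge, answer, SHARED_KEY) else "unknown")
```

Note the asymmetry built into such a check: a wrong or missing answer only means “unknown,” never “confirmed foe,” which is one reason why IFF alone cannot sensibly authorize an engagement.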

There are numerous technological solutions, for example the AN/APX-119 IFF digital transponder from Raytheon, which is used, among other things, in drones of the American border protection authorities. This shows that the use of high technology and robotic systems is not reserved for the military alone.

Modern systems such as the STM Kargu-2 even go so far as to offer facial recognition for target detection. This is useful if one wants to eliminate specific individuals, such as statesmen or high-ranking military officers, with such a weapon system. These systems are in a permanent race with technological developments designed to disrupt or confuse exactly this IFF.

But what are currently the most advanced robotic systems in armed forces? This video gives an impression:

https://www.youtube.com/watch?v=uXGj1kZnFEg

An overview of combat robots currently in use in armed forces, with more information on the most advanced systems, is here: https://www.analyticsinsight.net/top-advanced-military-robots-world/

So, as we see, the rather unspecific terms “military robot” and “combat robot” are not very helpful. On the one hand, they overestimate the development and the capabilities of the technologies used in these systems. On the other hand, they underestimate the narrative behind all these technologies: it is a question of morality whether IFF-based engagement is deployed as broadly and openly as possible, or whether a human retains the option to trigger the command. It is therefore good that this moral, ethical and political debate is taking place in some countries. But it should not obscure the fact that other countries are simply not having such debates. In the end, it remains a military consideration how to counter these countries and their capabilities. The next arms race is in full swing, and it will be limited not by technology, but by the differing readiness to use these technologies.

If you want to read more on this issue at vernetztesicherheit.de, please consult the article (in German) “Warum Cyborgs nur ein Zwischenschritt sind”.
