On 29 January 2023, Dmitry Rogozin, the former head of the Russian space agency Roscosmos and now head of the “Tsar’s Wolves” group of military advisers, claimed that four Marker robotic ground systems were being prepared for battlefield testing by the Russian armed forces in Ukraine.
Previously, Russian state media described the Marker as a vehicle integrating “the most advanced” autonomous driving capabilities in Russia, using artificial intelligence (AI)-based object recognition and processing data via neural network algorithms. The system was reportedly tested in a “fully autonomous” navigation mode while patrolling the Vostochny cosmodrome. Now, Rogozin asserts, the Marker will undergo a “baptism of fire” in combat and will be able to automatically detect and target Abrams and Leopard tanks once Ukraine receives them from its allies.
Such ambitious statements about Russia’s capabilities in the sphere of military AI and autonomy are not new. The Russian leadership, government, state media, and weapon manufacturers have put much effort into constructing and maintaining a narrative about Russia’s military modernization. As part of this narrative, robotic systems, especially uncrewed aerial vehicles (UAVs) or drones, as well as the integration of AI and autonomy into the armed forces, have been considered key priorities.
Russia’s full-scale invasion of Ukraine, however, revealed the mismatch between the narrative Moscow has been promoting and the reality of Russian military technological capabilities. This gap, perhaps unexpected for what was described as the “second army of the world”, is particularly visible in the ‘low-tech’ character of the Russian armed forces and their limited use of robotics and AI in weapon systems.
Moreover, the gap is likely to widen further as Russia’s technological development is hampered by, among other factors, the overall state of the Russian economy, sanctions, and the ‘brain drain’ of information technology (IT) specialists. Although officials such as Rogozin continue to advertise Russia’s military AI capabilities, these disinformation and PR efforts about Russia’s supposedly sophisticated and indestructible army are no longer credible.
AI and Autonomy in Russian UAVs
As part of its self-perception as a great power in competition with the US and NATO, Russia has tried since the 2010s to demonstrate that it can catch up with, and surpass, its rivals in military technology. Drones and AI have been an important part of the political and military leadership’s narrative: they have been associated not only with strategic advantages but also with the symbolism of a modern military. To prove that its army is ready for 21st-century combat, Russia has engaged in performative demonstrations via, among other means, military parades, strategic exercises, and propaganda surrounding Moscow’s campaigns and operations. As Defense Minister Sergey Shoigu admitted in 2021, all of Russia’s latest weapon systems had been tested in Syria. Russian state media particularly advertised the use of robotic ground systems such as the Uran-9, as well as loitering munitions, in the Syrian civil war.
Statements from Russian officials might have created some expectations that Russia would display increasingly autonomous and AI-based weapons on the battlefield. In April 2020, the director of the Advanced Research Foundation, Russia’s version of the US DARPA and the developer of the Marker, said that Russia, like other states, was moving towards replacing soldiers with more efficient and precise robotic systems. In May 2021, Shoigu claimed that Russia had started serial production of “combat robots capable of fighting on their own”. President Vladimir Putin stated in November 2021 that Russia had more than 2,000 UAVs in service, including some with integrated AI.
The information available about Russian weapon systems used in the invasion of Ukraine shows a very different picture from what was advertised prior to 2022. The levels of autonomy and AI integration in the UAVs used by Russia have so far been limited.
For instance, the KUB UAV – a loitering munition produced by ZALA Aero, part of the Kalashnikov Group, which is in turn a subsidiary of the Rostec state corporation – often features in debates about weapon systems with “concerning autonomous capabilities”. Photographic evidence of the KUB’s use in Kyiv appeared in March 2022 and was followed by several media and analysis articles suggesting that Russia might have used a “killer drone” or “killer robot” – the colloquial terms for autonomous weapon systems, which are capable of selecting and engaging targets, i.e., using force, without human intervention. However, Russian sources do not indicate that the KUB has autonomy in these ‘critical’ functions. Rather, it appears to navigate autonomously towards a target pre-selected by a human operator.
Similarly, the Russian-adapted versions of the Iranian Shahed-131 and -136 UAVs (often called “suicide drones” or “kamikaze drones”, as they self-destruct when attacking a target) do not seem to integrate automated target recognition algorithms or to carry sensors for such recognition. As Dominika Kunertova points out, these systems “provide Russia with the ability to strike targets deep in Ukrainian territory and on the cheap”. Although many of the Shahed attacks launched by Russia have reached their targets, the systems are described as “primitive” and unsophisticated “weapons for mass terrorist acts against civilians”.
Meanwhile, the Lancet-3 loitering munition, also manufactured by ZALA Aero and used in Ukraine (after being employed in Syria), has received much acclaim from Russian defense analysts, who consider it the most effective Russian UAV. The Lancets have been involved in some effective attacks in recent weeks, but it remains unclear how many of them Russia can produce.
The Lancet reportedly has a system of cameras and sensors that could allow it to “locate a target without human guidance”. Rostec officials have promoted the Lancet as “highly autonomous”, stating that the system “carries an optical-electronic system that helps independently ferret out and destroy a target”. These drones are also said to be capable of returning if a target has not been found.
Yet, as Zachary Kallenborn points out, it is challenging for outside observers to verify whether these, or any, systems have been used in a fully autonomous mode in an attack. For instance, while a UN report published in 2021 states that Kargu-2 loitering munitions were used “to attack targets without requiring data connectivity between the operator and the munition” in the Libyan civil war, there are ongoing debates about the implications of this assertion.
It is also difficult to evaluate manufacturers’ claims about systems’ capabilities in terms of AI and autonomy. This is further complicated by the ambiguity of the term ‘AI’, which is often used to refer to various spheres including robotics, machine learning, and different types of algorithms. However, the vagueness surrounding the definition of AI has also allowed Russian officials to attribute AI technologies to their weapon systems as part of their ongoing information war. Official statements often refer to ‘AI’ as a monolithic and coherent phenomenon without explaining the exact meaning of the term, with the ambition of portraying a system, and the whole military, as ultra-modern.
Despite the visible mismatch between Russian narratives and capabilities revealed in the invasion of Ukraine, the leadership, along with experts, state media, and other actors, has continued to maintain the official discourse surrounding drones and AI. This information war effort seems to have two main components and objectives.
AI in Russia’s Great Power Self-Portrayal
First, part of the narrative, most likely oriented towards domestic consumption, seeks to maintain Russia’s self-attributed great power status by claiming to possess and use the most modern weapons. Putin often mentions (see September 2018, November 2021, August 2022) that Russia possesses many weapons, including “high-precision and robotic” systems, which are “ahead of foreign counterparts” and have “no analogues around the world”. Other officials have insisted that all military objectives will be reached, or have denied Russia’s visible problems with weaponry. The CEO of Rostec, Sergey Chemezov, said in an interview in October 2022 that the corporation is ready to supply the KUB and Lancet loitering munitions in the “required amounts, if there is an order” from the Ministry of Defense.
In July 2022, Deputy Prime Minister Denis Manturov highlighted that the production of UAVs and of “high-precision and other new types of weapons and technology” remained a special priority for the government. At a Defense Ministry Board meeting on 21 December 2022, Putin called for the integration of AI technologies at “all levels of decision-making” in the armed forces, adding, “as experience shows, including that of recent months, the most effective weapon systems are those that operate quickly and almost in an automatic mode”. He noted, “We have no funding restrictions. The country and the government give the army everything it asks for, everything”, in an effort to signal that Russia’s military and economic capabilities have remained unaffected by the invasion of Ukraine and its aftermath, in contrast to expert observations that both have severely deteriorated.
It remains unclear to what extent the Russian population buys into this narrative. To Ukrainian and most foreign audiences, Russian claims have been discredited by reports and observations about the state of the Russian army and defense industry, revealing Russia’s true character as a “Potemkin superpower”. The Ukrainian armed forces have been visibly more successful at acquiring and using modern technologies such as drones.
When it comes to AI, Russia was already behind its own objectives due to issues such as a lack of investment, publications, cooperation with the private sector, and access to crucial hardware such as microchips. Sanctions, export controls, the departure of companies and IT specialists, and other measures affecting Russian technological capabilities risk widening the gap between officials’ statements and reality.
AI in Russia’s International Law Violations
Second, Russian officials claim to be using weapon systems with AI and autonomous features to defend what they see as the legitimacy of their strikes on Ukraine. Drones and AI technologies are often associated with precision, efficiency, and accuracy. Militaries and governments around the world justify their investments in military AI by claiming that it will allow for more ‘precise’ and thus more ‘ethical’ warfare.
The Russian official discourse has made use of such associations to portray its attacks as highly precise and respectful of international humanitarian law (IHL), which requires, among other things, distinguishing between civilians and combatants in warfare. As part of ongoing global debates on whether, or how, military AI can be used in compliance with IHL, such simplistic arguments have been criticized for not reflecting the state of the technology and the complexities of decision-making in warfare. In other words, algorithms are unlikely to be able to provide the situational awareness required for legally accountable warfare.
Shoigu said in April 2022 that the launching and target designation systems of the Kalibr and Kinzhal “high-precision missiles” have become more efficient due to the use of modern technologies, including “the integration of AI elements”. The same Kalibr cruise missiles killed more than 20 civilians in Vinnytsia in July 2022.
In August 2022, the Ministry of Defense announced the establishment of a new department for the development of AI technologies. Its newly appointed head, Vasily Yelistratov, declared that AI is “present in all weapons, especially in high-precision ones”, while the war of the future will be “a war of machines”. He has also suggested that AI makes weapons more intelligent, “and the more intelligent weapons are, the fewer losses will be sustained”, noting that “the most valuable thing is human life”. Such statements contrast sharply with Russia’s disregard for the lives of Ukrainians.
Similar claims have been made about the POM-3 or “Medallion” anti-personnel mine, also used in Ukraine. It has previously been described by state media as a “smart” mine with a “nanobrain” that integrates AI technologies. The manufacturer’s representatives claimed that the foundation of this “brain” is entirely made in Russia, meaning that potential enemies could neither hack into it nor understand how it works. They also said the mine would be able to independently distinguish between military personnel and civilians. As experts Toby Walsh and Lauren Kahn note, this is highly unlikely to be technically possible.
Yet Russian officials continue to invoke AI to portray Russia’s actions as supposedly motivated by respect for international law, even though Russia is not a party to the Mine Ban Treaty. Their declarations are contradicted by Russia’s constant violations of the core purpose and spirit of the Convention on Certain Conventional Weapons (CCW), which is to minimize the unnecessary and unjustifiable suffering of both civilians and combatants. Russian indiscriminate attacks on shopping malls and apartment buildings go against the very essence of the international legal principles Russian diplomats profess to care about at the UN.
Russian military officials use the concept of AI and its inherent ambiguity to portray their attacks as motivated by humanitarian concerns, such as striking only military targets and sparing civilians. The actions of the Russian armed forces, however, discredit these claims of saving human lives with supposedly precise weaponry. In reality, Russian attacks have targeted civilian objects, resulting in the deaths of civilians and the destruction of schools, hospitals, shopping malls, playgrounds, and other non-military sites. Russian officials rarely seem to care for the Ukrainian civilians affected by their strikes, although they task propagandists with justifying these attacks.
Takeaways for the Study of Military AI and Autonomy
Russia’s military modernization narrative, of which drones and AI are a key element, has been discredited by its ‘low-tech’ invasion of Ukraine. Russian officials’ claims of being able to integrate AI into the military or to launch domestic production of drones are unlikely to be convincing. Their arguments that AI supposedly assists in conducting ‘precise strikes which save lives’ are even less believable.
Thinking beyond Russia, however, arguments that weapon systems with autonomous or AI-based features have become a constant part of the battlefield should be treated with care. As Peter Burt writes, our understanding of such systems “is vague and based on manufacturer’s claims which are almost certainly exaggerated”. There is much ‘hype’ surrounding AI technologies and autonomy. At the same time, military AI and its integration into weapon systems are not sufficiently regulated at the global level.
In defending their country’s sovereignty, Ukrainian officials have also shown an interest in increasing autonomy in weapon systems. A report from the Associated Press published on 3 January 2023 quotes the Ukrainian Digital Transformation Minister, Mykhailo Fedorov, saying that Ukraine has been engaged in “a lot of R&D [research and development]” on “fully autonomous killer drones”. Fedorov added, “I think that the potential for this is great in the next six months”.
The continuous development of weapon systems with AI and autonomy by armed forces around the world, combined with the absence of specific international regulation, calls for a closer examination of developments and state practices in this area. Efforts to debate these trends are underway and should be pursued further. Yet, as seen with observations of Russia’s capabilities and as argued by Margarita Konaev, more attention should be paid to the differences between the claims made by various actors and their actual practices in developing, testing, and using weapon systems, including drones, that integrate AI technologies and autonomous features.
Anna Nadibaidze (@AnnaRNad) is a PhD candidate in International Relations at the Center for War Studies at the University of Southern Denmark (SDU). She is a researcher in the AutoNorms project, funded by the European Research Council.