The Russian-Ukrainian war, triggered by the Russian invasion of 2022, is not limited to military confrontations. At the heart of this modern war lies an invisible but decisive battlefield: that of information. Narrative management, to mobilize both domestic and international public opinion, has become a central weapon for both Ukraine and Russia.
The significant growth of social networks, and the use of disinformation as a political strategy, raise major questions for countries like Canada. How can we preserve our digital sovereignty in the face of foreign interference? What strategies can be adopted to protect citizens from information manipulation and strengthen their resilience in the face of fake news? Understanding how Ukraine secured its digital space, while thwarting Russian disinformation campaigns, offers crucial lessons on digital sovereignty and the protection of informational space.
Narratives in conflict: facing international opinion
Ukrainian narratives are built around the country’s defense and communication strategies. A striking example of how contemporary leaders use rhetoric to influence public opinion and strengthen their position is Volodymyr Zelensky, who made innovative use of social networks, relying on communication centered on three key elements: personalization of the message, emotional rhetoric, and mastery of the global media space. On the one hand, by broadcasting short, emotional messages to the Ukrainian public, he consolidated his country’s domestic resilience. On the other, he maintained a constant presence in the international media, adapting his discourse to the cultures and historical references of foreign parliaments. The global media narrative has thus positioned Ukraine as a legitimate victim in need of international support.
In military manuals, the principle of surprise is fundamental: revealing offensives in advance diminishes their impact. In Ukraine, however, communication about offensives seems to be part of a political strategy. Dependent on Western support, Ukraine uses these announcements to secure the continued backing of its partners by showing that it retains prospects of success. Although the Russian General Staff is probably aware of these operations thanks to its intelligence capabilities, such statements are aimed primarily at public opinion and international partners.
Russia, for its part, has developed a narrative centered on the defense of its strategic interests and the protection of certain Russian-speaking populations, thus justifying its actions in Ukraine. This narrative has been disseminated both at home and abroad, in an attempt to win the sympathy, or at least the neutrality, of other nations. It is important to note that Russia presents its actions as a “special operation”, seeking to minimize the scale of the offensive. This choice of terminology also aims to avoid the legal implications that the term “war” could entail, both domestically and internationally. Russia has also exploited historical references, such as the fight against Nazism during the Second World War, to legitimize its actions in the eyes of its own population and certain segments of international opinion. In particular, the Russian government denounces the association of Ukrainian forces with controversial formations such as the Azov Regiment, or the popularity of figures such as Stepan Bandera.
The war in Ukraine bears striking similarities to other conflicts, notably the situation in Gaza. In both cases, governments use narrative as a central element of their military strategy, seeking international support for their actions despite criticism grounded in international law. In Gaza, as in Ukraine, the narrative is built around the legitimacy of national defense in the face of a perceived threat, thereby justifying controversial military actions. In the context of the Israeli-Palestinian conflict, a recent example of disinformation emerged after the Hamas attack on October 7, 2023: unverified claims circulated in the Israeli media that “40 decapitated babies” had been discovered in a severely hit area. Although this story was relayed by various media and widely circulated internationally, it was later shown to be unfounded. Using images or stories of children in war is a powerful strategy, as it taps into a universal emotional register and symbolizes innocence. By portraying the enemy as insensitive to this innocence, these stories aim to demonize it in the eyes of the public, reinforcing the idea that it represents such an extreme threat that it does not even respect the lives of the most vulnerable.
This kind of discourse is reminiscent of past examples, such as the “babies ripped from incubators” in Kuwait before the Gulf War in 1990, when dramatic testimonies were used to influence public opinion in favor of military intervention. Thus, the information warfare surrounding these children’s stories not only attracts empathy, but also serves to dehumanize the enemy by reinforcing the international public’s sense of moral urgency.
The conflict in the Middle East has largely captured the attention of the American media, relegating the war in Ukraine to the background. This shift contributed to a gradual loss of public interest in the situation in Ukraine, a fatigue that was already apparent in political and media debates.
While strategic analyst Jacques Baud argues that narratives directly shape military strategies, other analysts, such as British military historian Lawrence Freedman, suggest that the manipulation of information can sometimes fail to influence the outcome of conflicts. For example, Freedman points out that despite intense US propaganda prior to the Iraq war, the rapid collapse of Saddam Hussein’s regime did not unfold as strategists had predicted.
This phenomenon is a good illustration of how war communication can be exploited to shape the perception of reality in a conflict context, reinforcing the idea that “it’s not so much what people think, but what they think about”, as Maxwell E. McCombs and Donald L. Shaw point out in their theory of agenda-setting.
Controlling and protecting Ukrainian information space
This apocryphal quotation from Rudyard Kipling remains pertinent: “The first casualty of war is always the truth.” Since the 1990s, Ukraine has been a playground for Russian influence, particularly through the media and digital infrastructures. However, it was after the events of 2014, with the annexation of Crimea and the conflict in the Donbass, that Ukraine began to perceive its informational space as a domain to be protected, on a par with its physical territory.
Since 2017, Ukraine has blocked Russian services such as Yandex and VKontakte, considering them vectors of disinformation. This policy intensified after 2022, with measures to secure digital infrastructure against cyberattacks, including the relocation of public databases to servers in neighboring countries. Blocking these services enabled Ukraine to limit Russian influence over its own digital territory and reduce the spread of pro-Russian narratives among the Ukrainian population.
Under the martial law introduced on February 24, 2022, a decree signed by Volodymyr Zelensky consolidated the national TV channels into a single platform for streaming war news. This was followed by a European Union ban on the broadcasting of Sputnik and RT, a decision taken in support of this effort to curb Russian influence, even though the channels remain only a few clicks away on the Internet.
The development of an online Ukrainian cultural space, such as Ukraine.ua, aims to counter the historical domination of Russian content in Ukraine and build digital sovereignty. The creation of independent Ukrainian media and the promotion of local culture have helped build a distinct Ukrainian narrative, strengthening national resilience in the face of external influences.
Disinformation and digital sovereignty: the battle of the social networks in Ukraine
Social networks have played a crucial role in disseminating and amplifying these narratives. Ukraine has used Twitter, TikTok and other platforms to maintain international support, spread narratives of resilience, denounce war crimes, and reach a younger audience. The phenomenon of fake accounts used to propagate pro-Russian or pro-Ukrainian messages is a concrete, if modest, illustration of how complex the media ecosystem has become in our era.
According to the Kiev Institute of Sociology, in October 2023, 44% of Ukrainians used Telegram as their primary source of information, surpassing television, used by 43% of citizens. Curiously, Ukraine recently banned the use of Telegram on the official devices of government personnel. The principle of freedom of expression, championed by digital platforms such as X (formerly Twitter), can create ambiguity about how the impact of disinformation should be handled. The situation in Brazil is a case in point. In August 2024, the Brazilian Supreme Court ordered the suspension of the X platform, criticizing it for failing to comply with judicial injunctions aimed at combating disinformation. The decision was prompted by X’s refusal to block accounts accused of disseminating false information, particularly in connection with supporters of former president Jair Bolsonaro.
In response, X took steps to comply with Brazilian legal requirements, including blocking the offending accounts and paying the fines imposed. These actions led the Supreme Court to lift the platform’s suspension in October 2024, allowing X to resume operations in Brazil. This case highlights the tensions between freedom of expression and the responsibility of digital platforms in managing misinformation, as well as the challenges faced by governments in regulating online content while respecting fundamental rights.
Initiatives such as StopFake.org, which aims to verify information and expose fake news, illustrate how Ukraine is trying to reclaim its informational space, with over 7,000 verified articles published on the platform. Another innovative example is the use of the Diia application, initially designed to simplify administrative procedures. Since the Russian invasion, Diia has become a civilian intelligence tool, allowing Ukrainian citizens to submit geotagged photos and videos of Russian military movements. This has enabled the population to participate actively in national defense by reporting suspicious activity.
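To make the idea of civilian geotagged reporting more concrete, here is a minimal Python sketch of what such a sighting report could look like on the client side. The field names, structure, and overall flow are purely illustrative assumptions; they do not describe Diia’s actual interface or data format.

```python
# Illustrative sketch only: the report structure is hypothetical and does not
# reflect Diia's real API or submission format.
import json
import hashlib
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class SightingReport:
    latitude: float      # GPS coordinates attached to the photo or video
    longitude: float
    observed_at: str     # ISO 8601 timestamp of the observation
    description: str     # free-text note from the citizen
    media_sha256: str    # hash of the attached media, for integrity checks

def build_report(lat: float, lon: float, description: str, media_bytes: bytes) -> str:
    """Package a geotagged observation as JSON, ready for submission."""
    report = SightingReport(
        latitude=lat,
        longitude=lon,
        observed_at=datetime.now(timezone.utc).isoformat(),
        description=description,
        media_sha256=hashlib.sha256(media_bytes).hexdigest(),
    )
    return json.dumps(asdict(report))

if __name__ == "__main__":
    payload = build_report(49.84, 24.03, "Column of trucks heading east", b"<photo bytes>")
    print(payload)
```

The point of the sketch is simply that each report bundles location, time, and verifiable media together, which is what makes such crowdsourced observations usable downstream.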
In a context of information warfare, who has the power to determine what is true? How can fact-checking initiatives avoid becoming propaganda tools themselves? Fact-checking often relies on established verification processes, metadata analysis, or the detailed examination of visual evidence, supported by specialized analysis software. However, the question of the objectivity of fact-checking organizations is frequently raised, as they are sometimes accused of serving political interests. The once fundamental role of the journalist, who acted as a filter for information before it reached the public, has weakened in the digital age. Any individual can now produce and distribute content online, facilitating the spread of false information. In March 2022, a deepfake showing President Zelensky calling on Ukrainians to lay down their arms circulated on social networks, although it was quickly debunked by the Ukrainian authorities.
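As one small illustration of the metadata analysis mentioned above, the following Python sketch reads the EXIF tags of an image with the Pillow library, so that a claimed time and place can be compared against what the file itself records. It is a minimal example rather than a full forensic workflow, and it assumes the image still carries EXIF data, which social platforms often strip. The file name is hypothetical.

```python
# Minimal EXIF inspection sketch using Pillow (pip install Pillow).
# Real verification combines this with source tracing and visual analysis.
from PIL import Image, ExifTags

def dump_exif(path: str) -> dict:
    """Return the human-readable EXIF tags of an image, if any."""
    img = Image.open(path)
    exif = img.getexif()
    readable = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    # GPS data lives in a sub-directory (IFD) of the EXIF block.
    gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo tag
    if gps_ifd:
        readable["GPSInfo"] = {ExifTags.GPSTAGS.get(t, t): v for t, v in gps_ifd.items()}
    return readable

if __name__ == "__main__":
    tags = dump_exif("claimed_frontline_photo.jpg")  # hypothetical file name
    print(tags.get("DateTime"), tags.get("GPSInfo"))
```

A timestamp or coordinates that contradict the accompanying claim do not settle the matter on their own, but they are exactly the kind of concrete signal fact-checkers look for before amplifying or rejecting a story.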
The complexity of predicting future scenarios
SIGINT (signals intelligence) and cyber capabilities have likewise played a role in the Ukrainian conflict. According to the Institut Français de Géopolitique, electromagnetic flows play a crucial role in the conduct of modern military operations. With NATO support, Ukraine has gained access to advanced communications surveillance technologies, giving it a considerable advantage in the field.
In 2022, Ukraine used artificial intelligence tools, notably with the help of Palantir and its Gotham platform, which enabled the aggregation of SIGINT. For its part, Starlink contributed satellite support and network coverage of Ukrainian territory. Together, these tools made it possible to analyze Russian troop movements and anticipate future conflict zones. These predictions enabled the Ukrainian army to prepare for offensives, limiting human and material losses. However, Starlink’s reliability has sometimes been called into question. In some cases, the company restricted or interrupted its service, citing security concerns or strategic considerations. The Ukrainian army reported outages in areas recaptured from Russian forces, provoking debates about reliance on private companies for critical military operations. At the same time, Russia itself has attempted to disrupt the Starlink connection in Ukraine, making access to the network unstable at times.
These interruptions have fuelled concerns about the company’s ability to provide constant support in wartime. Other sources indicate that Starlink’s performance in Ukraine, in terms of speed and coverage, also varies, posing additional challenges. In this sense, the situation highlights issues of Ukraine’s digital sovereignty and strategic autonomy, raising the risk that a private company could directly influence the availability of critical resources in times of conflict.
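To give a very simplified sense of what “aggregating” geolocated reports, as described above, might involve, the sketch below bins hypothetical sightings into a coarse grid and flags the densest cells. It uses only the Python standard library and bears no relation to Palantir Gotham’s actual methods; it is offered purely as a conceptual analogy for how concentrations of activity could be surfaced from many individual observations.

```python
# Conceptual sketch: grid-based density count over geotagged sightings.
# This is NOT how Gotham works; it only illustrates the general idea of
# turning many individual reports into a map of likely activity hotspots.
from collections import Counter
from typing import Iterable, Tuple

Sighting = Tuple[float, float]  # (latitude, longitude)

def hotspots(sightings: Iterable[Sighting], cell_deg: float = 0.1, top_n: int = 3):
    """Bin sightings into cells of `cell_deg` degrees and return the densest cells."""
    counts = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        for lat, lon in sightings
    )
    return counts.most_common(top_n)

if __name__ == "__main__":
    # Hypothetical sample reports, not real data.
    reports = [(48.01, 37.80), (48.02, 37.81), (48.03, 37.79), (49.99, 36.23)]
    for (lat, lon), n in hotspots(reports):
        print(f"~{n} reports near {lat:.1f}, {lon:.1f}")
```

Real systems fuse far more heterogeneous sources and add temporal modeling, but the underlying logic is the same: many low-value individual observations become valuable once they are correlated in space and time.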
The “Discord leaks” revealed that the Russian Ministry of Defense had been infiltrated by US intelligence, reinforcing the idea that the Russian invasion had been anticipated long before it began. Yet several European countries, notably Germany, France and Italy, failed to anticipate the Russian invasion of Ukraine because of their economic ties with Russia: Germany was dependent on Russian gas, France was aiming for strategic cooperation, and Italy, also energy-dependent, had important trade links. The war revealed the risks of this blind trust in Moscow.
The right framework: military doctrine
Military doctrine is a strategic framework designed to align available resources with the specific objectives of each nation. Comparison with other conflicts provides lessons on how predictions can be distorted by misinformation. Prediction errors during the 2003 Iraq war, based on false information about weapons of mass destruction, show the extent to which strategic decisions can be influenced by narratives constructed to justify military action.
Similarly, the misinterpretation of the situation in Ukraine by certain experts shows the limits of strategic forecasting in the fog of war. It is essential to recognize that each army operates according to its own logic, dictated by its national doctrine, and yet some specialists from NATO countries are astonished that Russia does not wage war using the same techniques they would. Attempting to understand the armed forces of one country through the concepts of another’s military doctrine can lead to misinterpretation.
Other analysts have shown that their assessments of battles or of the performance of military equipment are often shaped by their own experiences, without considering the specific context of the adversary. Knowing the strengths and weaknesses of one’s own side is important, but Sun Tzu reminds us that it is just as crucial to understand how the enemy thinks, reacts and mobilizes resources. This point has been neglected in many military analyses, leading to inaccurate conclusions. Accurate information must be obtained through tools such as intelligence and reconnaissance: the battlefield must not be imagined, but assessed on the basis of concrete facts. Claims that Russia was too weak to carry out its “special operation”, or to sustain a prolonged war, have proved unrealistic. Voices such as that of Michael Kofman also emphasize that the Russian army, despite its logistical shortcomings, remains a force not to be underestimated, calling into question over-optimistic analyses of a rapid Russian collapse. Is Russia in a position of strength? With production running high and consumption falling, it does not seem ready to capitulate any time soon.
Regardless of the fog of war created by narratives, whether in the media or the public sphere, an armed force must trust the skills acquired during its training. This means respecting its duties, within the framework provided by national military doctrine. In the face of major disinformation, this framework helps maintain clarity in the psychological environment of soldiers, captains, and commanders, shielding them from its fluctuating influence. They are not distracted by outside stories but rely on concrete observations and first-hand knowledge of the field, a knowledge built up through experience and the analysis of facts on the ground.
In this way, the information passed on to decision-makers is more reliable and less exposed to the distortions of external narratives, enabling more appropriate decision-making rooted in reality.