The Kremlin’s playbook for its war on Ukraine relies on an utterly distorted portrayal of events. During the past year, Moscow’s rationalization of its violence in Ukraine – already hydra-like since its 2014 illegal annexation of Crimea and invasion of the Donbas region – has sprouted a number of narratives, including the frequent smear of Ukrainians as “Nazis” or “Satanists” and repeated allusions to illusory bioweapon labs. Yet despite these pervasive efforts at manipulation, it appears that Ukraine and its allies are, in key respects, winning the war in the information space across most of the transatlantic community.

To achieve this success, Ukrainians have employed critical adaptations to identify and overcome Russian government efforts to simultaneously delegitimize the idea of Ukraine as a sovereign and democratic state, legitimize and obscure the violence and depredations of the Russian invasion, and ultimately fracture the democratic alliance.

In its fight against the Kremlin’s distortions, Ukrainian society has seized three key advantages – deep preparation, networks of cooperation, and active use of new technology, including artificial intelligence – that have helped civil society organizations and governments build trust and tell Ukraine’s story, unite Ukrainians and their allies, and ensure resilience in the face of a pervasive authoritarian disinformation campaign. These efforts have fortified public support across much of Europe and the United States, support that has been critical to Ukrainians’ ability to maintain the integrity of their state and defend themselves on the battlefield in the face of Moscow’s assault.

The Importance of Preparation

Ukrainians have learned to mitigate risks in the information space through hard-won experience. Moscow’s use of disinformation in Ukraine goes back to Soviet times and continued through Russian influence activities in the aftermath of the Soviet Union’s 1991 collapse. As the Kremlin’s encroachments have intensified since 2014, Ukraine’s civil society, news media, and activist community have built their capacity, working with democratic partners to counter malign authoritarian narratives.

Ukrainians trace the origins of this response to the Revolution of Dignity in late 2013 and early 2014, when Kremlin campaigns smeared the Euromaidan’s pro-Europe protesters as fascists and neo-Nazis in order to lay the groundwork for the annexation of Crimea and rolling invasion of eastern Ukraine. Many of the same narratives were redeployed to explain the full-scale invasion in February 2022.

As Galyna Petrenko of Ukrainian civil society organization Detector Media observes in a report we authored (now available here), the period between 2014 and 2022 represented a crucial stage of development for the counter-disinformation community in Ukraine, during which its ability to coordinate and build essential response mechanisms matured. During this time, many Ukrainian organizations shifted their focus to securing the integrity of the information space and began experimenting with new methods of countering Kremlin narratives. They improved collaboration and information sharing to build a community of trust, educating the Ukrainian public about Moscow’s goals in the information space and strengthening citizens’ media literacy and resilience to manipulation.

The emerging critical mass of activity to counter the Kremlin’s information distortions had the effect of greatly improving the Ukrainian public’s knowledge and awareness of Russian narratives and tactics. For example, according to national survey data, the percentage of Ukrainians who understood the war in Donbas to be the result of Russian aggression increased from 49 percent in February 2019 to 65 percent in December 2021.

Deepening Cooperation

In an increasingly complex information space, Ukrainian civil society organizations have leveraged diverse skill sets to form cooperative networks that have the scale, sophistication, and speed necessary to stay ahead of the adaptations in messaging strategy churned out by the Kremlin’s multiheaded propaganda machine.

Such collaboration includes the efforts of data scientists, narrative researchers, web-traffic analysts, web marketers, sociologists, and investigative journalists. Through dedicated information-sharing across sectors, these networks can identify disinformation narratives and design timely, effective responses. They also help civil society organizations achieve valuable economies of scale that would otherwise be out of reach.

Given the tendency of disinformation narratives to cross platforms and outlets to reach diverse audiences, cooperation across and between sectors is critical to building the capacity to resist and counter disinformation. The National Democratic Institute’s Counter Disinformation Coordination Hub – an adaptable network of roughly 25 local civil society groups, media organizations, and international organizations – has provided a platform to Ukrainian civil society organizations as they sought to react to Moscow’s pivot to “hyperlocal” disinformation campaigns. In those operations, Russian state-controlled outlets have attempted to influence Ukrainians through content-sharing agreements with cash-strapped local news outlets and localized channels on Telegram, an encrypted messaging application that has become one of the most popular information sources in Ukraine since the start of the full-scale invasion.

The Hub also connects Ukrainian civil society organizations to local journalists across the country – many of whom are operating in active conflict zones. These connections allow both groups to better understand the dynamics of Russian disinformation operations in context and to design localized messaging in response.

Leveraging New Technologies

It would be difficult, if not impossible, for even the most well-staffed civil society organizations to directly monitor emergent disinformation narratives across the global media ecosystem. However, artificial intelligence (AI) and machine-learning tools have made it easier to rapidly detect patterns across a complex and evolving information ecosystem. This, in turn, empowers disinformation researchers to quickly pick up on emergent Russian narratives, and gives messaging specialists more time to design an effective response before narratives cross channels and platforms to influence larger audiences. Further, by facilitating historical analysis of Russian disinformation over time, AI and machine-learning tools enable counter-disinformation specialists to predict future campaigns on the basis of societal fault lines, cultural tropes, annual events, and historical knowledge.

For example, the organization Texty in Ukraine uses AI and machine-learning tools to identify new pro-Kremlin narratives across a number of platforms, including Telegram. Texty has used its advanced technology to rapidly perform analyses of thousands of Telegram channels, where Russian narratives may be impactful but otherwise difficult to identify and counter.
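The pattern-detection step that such tools automate can be illustrated with a deliberately simple sketch. Production systems like Texty’s rely on trained multilingual models and far richer signals; the toy clusterer below (its tokenizer, TF-IDF weighting, similarity threshold, and sample messages are all invented for illustration) merely shows the shape of the task: grouping near-duplicate messages pushed across many channels so that a researcher reviews one candidate narrative instead of thousands of individual posts.

```python
import math
import re
from collections import Counter

def tokenize(text):
    # Crude lowercase word tokenizer; real systems handle Ukrainian
    # and Russian morphology, emoji, and transliteration.
    return re.findall(r"[a-z']+", text.lower())

def tfidf_vectors(docs):
    """Return a sparse TF-IDF vector (dict of term -> weight) per document."""
    tokenized = [tokenize(d) for d in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))  # document frequency: count each term once per doc
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (c / len(toks)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(a, b):
    num = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return num / (na * nb) if na and nb else 0.0

def cluster(messages, threshold=0.25):
    """Greedy single-pass clustering: each message joins the first cluster
    whose seed (first member) it resembles, else starts a new cluster."""
    vecs = tfidf_vectors(messages)
    clusters = []  # list of (seed_vector, member_indices)
    for i, v in enumerate(vecs):
        for seed, members in clusters:
            if cosine(seed, v) >= threshold:
                members.append(i)
                break
        else:
            clusters.append((v, [i]))
    return [members for _, members in clusters]
```

Lexical similarity of this kind fails across languages and paraphrases, which is one reason production tools use learned embeddings instead; the pipeline shape (collect, vectorize, cluster, triage) is the same.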

Detector Media, mentioned previously, uses AI and machine learning to better understand Moscow’s efforts in information ecosystems beyond the transatlantic community. Cooperating closely with LetsData, a Ukrainian private-sector firm that provides AI and machine-learning services, Detector performs real-time discourse monitoring in more than 30 countries. It is possible to do this work manually, but an algorithm can detect in 10 seconds what might take an unassisted researcher an hour (or longer) to discover.

By coordinating such AI-driven narrative and audience research, public polling data, and focus groups, counter-disinformation networks can direct their efforts to the specific narratives that are empirically gaining the most traction among crucial audiences, and create narrowly tailored responses that reach the right people.
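That prioritization step can be sketched in a few lines, under entirely hypothetical assumptions: suppose each sighting of a narrative records the estimated reach of the channel carrying it and the audience segment that channel serves, and suppose the response network weights segments by priority. The narrative labels, channel names, reach figures, and weights below are invented for illustration, not drawn from any real monitoring data.

```python
from collections import defaultdict

# Hypothetical sightings: (narrative, channel, estimated reach, audience segment).
SIGHTINGS = [
    ("biolabs", "telegram_local_east", 40_000, "frontline"),
    ("biolabs", "facebook_diaspora", 15_000, "diaspora"),
    ("nato_blame", "telegram_local_east", 40_000, "frontline"),
    ("nato_blame", "tv_mirror_site", 5_000, "general"),
]

# Hypothetical weights encoding which audiences responders prioritize.
SEGMENT_WEIGHT = {"frontline": 3.0, "diaspora": 1.5, "general": 1.0}

def rank_narratives(sightings, weights):
    """Score each narrative by reach weighted by audience priority,
    so responders tackle the highest-traction narrative first."""
    traction = defaultdict(float)
    for narrative, _channel, reach, segment in sightings:
        traction[narrative] += reach * weights.get(segment, 1.0)
    return sorted(traction.items(), key=lambda kv: -kv[1])
```

In practice the reach and segment inputs would come from the polling and focus-group work described above, which is what lets the ranking reflect real audiences rather than raw post counts.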

One challenge to implementing technology-driven approaches more broadly, within Ukraine and beyond, is the difficulty of hiring talented AI engineers away from the private sector, where compensation is greater. Moreover, digital-rights activists raise valid concerns about the threats that AI-driven tools such as ChatGPT pose to the integrity of the information space, since such tools may be used to automatically generate convincing disinformation at scale.

Russia’s Larger Information Ambitions

The Russian authorities’ ambitions in the information domain are global and therefore relevant to all free societies. Over the past year, far beyond Ukraine, the Kremlin has wielded disinformation to blame Kyiv or NATO countries for the conflict, or otherwise dampen support for Ukraine’s cause. Such investments in the information space have yielded far greater results in regions such as Latin America and Africa, where the Kremlin’s toxic messaging goes virtually unchallenged as a result of political, economic, and historical ties to Moscow, as well as lower awareness of the dangers such narratives pose to democracy.

There is little evidence to suggest that the leadership in Moscow or like-minded authoritarian regimes will change course in their disinformation efforts. Given the significant payoff derived from their relatively inexpensive and low-risk disinformation activities to date, these regimes can be expected to continue to exploit asymmetries that enable them to sow confusion in information spaces around the world.

The threat Moscow’s disinformation machine poses is clear. While its claims about Ukraine typically defy observable reality, they are a critical component of the Kremlin’s information strategy, which aims to unmoor societal perceptions from fact-based reporting and experience. Through such activities, Russian leadership undermines the very concept of knowable truth. Philippine journalist and Nobel Peace Prize laureate Maria Ressa puts the risk such actions pose into stark relief: “Without truth, you can’t have trust. Without trust, we have no shared reality, no democracy, and it becomes impossible to deal with our world’s existential problems.” Given the high stakes, democratic societies must work together more intensively with and in support of Ukraine to see through its vision for victory, and to apply the relevant lessons learned from the conflict to democratic societies elsewhere.

(This article is drawn from a report, now available: “Shielding Democracy: Civil Society Adaptations to Kremlin Disinformation About Ukraine.”)

IMAGE: This illustration photo taken on February 14, 2023 shows a phone screen displaying a picture of rescuers working on a residential building destroyed after a missile strike, in Dnipro, Ukraine, on January 16, 2023, with the WarOnFakes.com website in the background displaying a fake video of the same residential building. A Russian missile smashed a Ukrainian apartment complex, killing dozens. Pro-Russian propagandists offered a slick counter narrative that shifted the blame away from Moscow — using pseudo fact-checking as a tool of disinformation. (Photo by OLIVIER DOULIERY/AFP via Getty Images)