Published: 2 July 2020
“Russian propaganda is constantly shifting strategies, focusing more and more on amplifying messages rather than creating its own. So it’s both evolving and becoming less detectable at the same time.” – Rumena Filipova, research fellow at the Center for the Study of Democracy.
Russian disinformation is increasingly influencing European public discourse in progressively sophisticated and subtle ways. A partnership led by Sofia-based think tank Center for the Study of Democracy (CSD), and supported by Civitates, examines the dynamics of Russian disinformation in five countries across Europe.
Rumena Filipova, a research fellow at CSD, is leading one of the first major studies examining Russian disinformation in both Western and Central-Eastern Europe.
What are you and your partners aiming for?
Our objective is to draw conclusions about the comparative vulnerabilities of Bulgaria, Poland, the Czech Republic, Germany and France to Russian disinformation, as well as to identify effective responses. Russia has a clear pattern and a set of overall goals for disseminating its messages, and its tools of influence increasingly work through local actors in a variety of ways.
We are working on an evidence-based guidebook of strategies to combat disinformation for a broad group of governmental and civil society actors. All of us share a great enthusiasm for the project. Our main motivation stems from the comparative aspect: the most exciting part is to see that our countries exist within a broader context and to see the similarities and differences.
What have you found so far?
One of the key findings emerging across the five countries is that economic and political trends and ties shape propagandist messages. The clearest pattern is that the more politically and economically connected a given local outlet is with Russian groups and interests, the more directly it pushes Russian messages.
A major Europe-wide problem is the lack of transparency in media ownership. Murky Russian influence operates through financial ties, economic ties and the beneficial ownership of news outlets, and this has to be revealed to the public. Russian tactics, however, differ between countries according to historical relations and local political and economic factors.
For instance, to facilitate the dissemination of propagandist messages in Central and East European states, Russia capitalises on historically continuous and deeply embedded channels of influence. These include long-standing political-oligarchic and intelligence networks dating back to the communist period as well as traditionally pro-Russian societal attitudes.
In contrast, in Western European countries such as France and Germany, the Kremlin mostly operates through fringe actors (including far-right and far-left groups and individuals). Moreover, its disinformation goals are focused on influencing discourse regarding specific issue areas (e.g. migration) and political trends (such as sowing distrust in liberal democracy and supporting Moscow’s preferred candidates during elections).
Additionally, during the Covid-19 pandemic, we have observed that Russia and China’s disinformation strategies and messages increasingly overlap in seeking to undermine faith in EU and NATO solidarity. They both aggressively promote the narrative that authoritarian regimes are better able than democracies to cope with the health crisis (or indeed with any other crisis). This entails the need for European governments and institutions to respond robustly to the ‘authoritarian challenge’ through an effective coherent policy seeking to expose disinformation, as well as related economic, security and other risks.
What will the results be used for?
The case studies for each country as well as a comparative analysis and policy proposals will be compiled in a guidebook that is to be launched in Brussels this autumn. The event will gather diplomats, experts, media, civil society, think tanks and academics, reflecting on the project’s holistic and cooperative approach. We see the publication and the event as tools that can help to achieve change: they will set the stage for continued engagement with stakeholders and contribute to policy impact on the European level. We expect the results to build more connections and identify effective responses, both nationally and across the continent.
How do you analyse large quantities of messages?
We have joined forces with a technology partner, Sensika, to increase the scope of our analysis of online news sources and social media. This AI-powered tool is designed to study disinformation, particularly to monitor propagandist narratives in real time. It examines massive amounts of data to look for trends in engagement metrics, complementing our researchers’ analysis of which messages are being promoted. So the trends the technology finds are combined with our more interpretive work – analysing the messages, the context, and the attitudes that cannot easily be detected by a machine.
What do you personally hope with regard to the outcome of the project?
My hope is for the guidebook to become a source of cross-European referencing about the vulnerabilities and strengths that countries share, and for it to serve as a knowledge kit about foreign authoritarian state disinformation. I also hope that our comparative research will be built upon further, so that we continue to forge a pan-European approach to the anti-democratic threats we are facing in the (online) media environment.