Editor's note: Several vera.ai project partners and their teams, supported by students from other entities and affiliations, participated in the 2023 Winter School and Data Sprint organised by the University of Amsterdam's Digital Methods Initiative, which took place from 9-13 January 2023 in Amsterdam. The occasion was also used to work on several projects and research undertakings that are part of the overall vera.ai activities, aims and goals. Here, a team run by the organisers themselves reports on their project, which dealt with mapping what they call "post-truth spaces" in the context of the war in Ukraine.

Mapping post-truth spaces concerning the war in Ukraine

Since the intensification of the war in Ukraine in February 2022, there has been ample evidence suggesting an increased effort to manipulate public perception of it. In the past twelve months, Ukraine, the Ukrainian government and ordinary citizens have become the targets of attacks orchestrated by pro-Kremlin disinformation operatives.

Despite the undisputed evidence of unprovoked Russian aggression, some online communities are vulnerable to pro-Kremlin rhetoric that encourages them to question the conflict and its effects. The sensitivity of the topic doesn’t stop them from presenting the war as a “hoax” and Ukrainians as “crisis actors”.

We define these communities as belonging to what we call post-truth spaces. Post-truth spaces are web and social media areas where misbeliefs, misjudgements or other epistemic failures are not viewed as problematic. On the contrary, in these areas, alternative news and knowledge are deemed fresh and free from mainstream bias. They are welcomed as new sources of information on current events and geopolitics, including - but not limited to - the war in Ukraine.

At the 2023 Winter School organised by the Digital Methods Initiative (DMI) at the University of Amsterdam (UvA), a team led by Prof. Richard Rogers, Maria Lompe, Emillie de Keulenaar and Kamila Koronska tested a “detection method” in a project they facilitated. The method is intended to allow fact-checkers to detect such post-truth spaces more quickly and to verify actors that spread pro-Kremlin propaganda on social media.

To satisfy the vera.ai project requirements, the team was also responsible for testing the platform- and method-specific needs of researchers who study disinformation.

Visualisation of components (Image: University of Amsterdam / DMI Team)

Separate from the team's main research agenda, they also embarked on a spontaneous research journey, looking into “attention fatigue” concerning the war in Ukraine on Facebook. The aim was to find out whether the topic still receives as much attention on Facebook as it did in the early days of the escalation of the Russo-Ukrainian war.

What we did during the DMI Winter School 2023

The “Mapping Post-truth spaces” team was thrilled to facilitate a week-long research project at the DMI Winter School 2023. The topic of this year's event was “The use and misuse of Open Source Intelligence (OSINT)”. We worked in a group of 12, including four co-facilitators whose role was to oversee the progress of the project and the research techniques applied. In addition, participants of the Winter School had the chance to attend tutorials that allowed them to learn about the DMI tools used for this project (4CAT) and to widen their knowledge of topics such as natural language processing (NLP) by taking part in a tutorial organised by Emillie de Keulenaar.

We divided our workload into three sub-projects: (1) a group of four students mapped post-truth spaces on Douyin (also known as the Chinese TikTok); (2) a group of eight students analysed post-truth spaces on Facebook, using the method mentioned above, across different languages (Bulgarian, Lithuanian, Hungarian, Polish, Czech, Slovak, Dutch and Swedish); and (3) in a smaller sub-project, everyone was involved in enquiring into the attention fatigue surrounding the conflict on Facebook.

Mapping post-truth: A cross-language perspective


For each language we discovered different problematic / disinformation clusters, and differences in the extent to which these post-truth spaces were prominent in the information landscape. In the Czech-language data there are large, central nodes of pro-Russian disinformation sources, whereas in the Bulgarian, Slovak and Polish data these sources were present, but on the periphery. In Czech, the content is mostly accessible through sites with Russian domains, alt-right Facebook groups and pages, and misinformation Telegram channels relating to the war in Ukraine. In Bulgarian, the cluster of problematic / pro-Russian information is accessible through mainstream Russian sources (Russia Today or Kremlin.ru).

In Poland the knowledge graph is very dispersed, with no indication of prominent disinformation clusters. Mainstream media sources and crowdfunding sources dominate the information sphere there. Nonetheless, we found pro-Russian sources that were intertwined with mainstream media outlets. These were pro-Russian narratives shared by extreme right / conspiratorial conglomerates (Legaartis.pl, wRealu24.pl, nczas.com).

In Slovakia this content appears on Facebook through Slovakia's own Alex Jones, Tibor Rostás, through pro-Russian official media channels, including the page of the Russian Embassy, and through so-called “investigative journalists” such as the “Investigative journalism blog” Twitter OSINT farm (with the blue verification mark). Interestingly, a lot of pro-Kremlin content in Slovakia appears on Facebook through Czech-language sources.

In other countries, problematic / disinformation clusters were situated on the periphery. In Hungary, discourse about the war was driven by major news outlets, both pro-government and independent. We spotted a similar pattern in Lithuania, where the majority of information about the war in Ukraine is posted in Lithuanian-speaking Facebook groups and pages, as well as on verified pro-Ukrainian channels mostly led by activists and civic organisations. Only a few suspicious Facebook groups were spotted in our network (PRABUDIMAS 🌞 ПРОБУЖДЕНИЕ).

In the Netherlands problematic sources are also in the minority. Around prominent nodes we observed Dutch mainstream media outlets as well as multiple Dutch institutional websites and social media platforms. Similarly, in the Swedish media landscape the most prominent voices on the conflict are Swedish mainstream media sources, political parties and social media sites. No suspicious actors are present among the most-linked-to in the network as a whole, and there are few notable suspicious actors within the clusters around these larger nodes.
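The country networks described above were built with the link and network tooling listed in the resources below (4CAT data exports, visualised in Gephi). As a rough, hypothetical illustration of one such step, the sketch below turns a CrowdTangle-style post export into a Gephi-ready edge list connecting pages to the domains they link to; the column names and file names are assumptions, not the team's actual pipeline.

```python
# Hypothetical sketch: derive a Gephi edge list (page -> linked domain) from a
# CrowdTangle-style post export. The column names "Page Name" and "Link" are
# assumptions about the export format, not confirmed by the team.
import csv
from collections import Counter
from urllib.parse import urlparse

edges = Counter()
with open("posts_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        page = (row.get("Page Name") or "").strip()
        link = (row.get("Link") or "").strip()
        if not page or not link:
            continue
        domain = urlparse(link).netloc.removeprefix("www.")
        if domain:
            edges[(page, domain)] += 1

# Gephi can import a simple CSV with Source, Target and Weight columns.
with open("edges.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target", "Weight"])
    for (page, domain), weight in edges.items():
        writer.writerow([page, domain, weight])
```

The resulting edges.csv can then be loaded via Gephi's spreadsheet import and explored with the layout and clustering techniques covered in the resources below.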

How has the attention to the war in Ukraine evolved over time, according to Facebook interactions on war-related content? 


As pointed out above, this sub-project investigated engagement with war-related content on Facebook, comparing a series of Eastern European countries (Bulgaria, Czech Republic, Lithuania, Poland and Slovakia) with the Netherlands and Sweden.

Pro-Ukraine vs. pro-Russia compared (Image: University of Amsterdam / DMI Team)

It found that, overall, attention to the war has declined, as measured by engagement with war-related content on Facebook in each language. It also found that certain countries (particularly the Netherlands) exhibit more engagement with (hyper)partisan content than others. Finally, when examining the posts in which exclusively pro-Ukrainian or pro-Russian keywords appear, certain countries engage more with one side than the other, e.g. the Netherlands (more pro-Ukrainian) and Bulgaria (more pro-Russian).

For both sets of research questions we created pro-Russian, pro-Ukrainian and generic war keyword lists, and queried Meta's CrowdTangle for the keywords in Bulgarian, Czech, Dutch, Lithuanian, Polish, Slovak and Swedish. The team used the pro-Ukrainian and pro-Russian keywords (together) as proxies for (hyper)partisan attention and the generic keywords for general attention. We then used the pro-Ukrainian and pro-Russian keywords (separately) as indicators of sentiment. Finally, we plotted (hyper)partisan attention against generic attention over time, as well as pro-Ukrainian versus pro-Russian sentiment per country.
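As a hedged illustration of the plotting step, the sketch below aggregates weekly interactions for the generic and (hyper)partisan keyword queries from CrowdTangle-style CSV exports and plots them against each other. The column names ("Post Created Date", "Total Interactions") and the file names are assumptions rather than the team's actual export format.

```python
# Minimal sketch of the attention-over-time comparison, assuming one
# CrowdTangle-style CSV per keyword list. Column and file names are
# illustrative assumptions, not the team's actual data layout.
import pandas as pd
import matplotlib.pyplot as plt

def weekly_interactions(path: str) -> pd.Series:
    df = pd.read_csv(path, parse_dates=["Post Created Date"])
    # Interaction counts are sometimes exported with thousands separators.
    interactions = (
        df["Total Interactions"].astype(str).str.replace(",", "").astype(float)
    )
    return interactions.groupby(df["Post Created Date"].dt.to_period("W")).sum()

# One export per query set, e.g. for the Dutch data (file names are hypothetical).
generic = weekly_interactions("nl_generic_war_keywords.csv")
partisan = weekly_interactions("nl_pro_ua_keywords.csv").add(
    weekly_interactions("nl_pro_ru_keywords.csv"), fill_value=0
)

ax = generic.plot(label="generic war keywords")
partisan.plot(ax=ax, label="(hyper)partisan keywords")
ax.set_ylabel("weekly Facebook interactions")
ax.legend()
plt.tight_layout()
plt.show()
```

Repeating this per language, and splitting the partisan series into its pro-Ukrainian and pro-Russian parts, would give the per-country sentiment comparison described above.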

The conversation around the Russo-Ukrainian conflict in China

The situation on Douyin, China: snapshot (Image: University of Amsterdam / DMI Team)

This research focused on the distribution of information about the Russia-Ukraine conflict on Douyin, the online short-video platform also known as the ‘Chinese TikTok’, and analysed the public perception of the conflict as viewed through the platform. To study Douyin, we formulated a list of pro-Russian and pro-Ukrainian keywords and queried them via the search function. We subsequently scraped the returns, some 1,000 in total. We then compared the videos containing the pro-Russian and pro-Ukrainian hashtags and found that the engagement with pro-Russian videos is much greater than with pro-Ukrainian ones. Meanwhile, the term “Russia-Ukraine conflict” is mentioned far more often than “Russia-Ukraine war”. Finally, the key phrase ‘counter hegemony’ has emerged in videos with pro-Russian hashtags.
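A minimal sketch of the hashtag-group comparison might look like the following, assuming the scraped Douyin returns were stored as JSON records with hashtag and engagement fields. The field names and example hashtags below are illustrative placeholders, not the team's actual keyword lists.

```python
# Hedged sketch: compare engagement between pro-Russian and pro-Ukrainian
# hashtag groups in scraped Douyin data. Field names ("hashtags", "likes",
# "comments") and the hashtag sets are illustrative assumptions.
import json
from statistics import mean

PRO_RU = {"俄罗斯加油", "反霸权"}   # placeholder pro-Russian hashtags
PRO_UA = {"乌克兰加油"}             # placeholder pro-Ukrainian hashtags

def engagement(video: dict) -> int:
    return int(video.get("likes", 0)) + int(video.get("comments", 0))

with open("douyin_videos.json", encoding="utf-8") as f:
    videos = json.load(f)

ru = [engagement(v) for v in videos if PRO_RU & set(v.get("hashtags", []))]
ua = [engagement(v) for v in videos if PRO_UA & set(v.get("hashtags", []))]

for label, values in (("pro-Russian", ru), ("pro-Ukrainian", ua)):
    if values:
        print(f"{label}: {len(values)} videos, mean engagement {mean(values):.0f}")
```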

This analysis led us to conclude the following:

  1. At the data collection stage, it was challenging to find obvious pro-Ukrainian hashtags in Douyin.
  2. The word clouds that we plotted to analyse the data (a minimal sketch of this step follows this list) showed that the engagement with pro-Russian videos is indeed much greater than with pro-Ukrainian ones. Even when we eventually found videos with pro-Ukrainian hashtags, most videos concerning the topic were still pro-Russian.
  3. Out of all the pro-Ukrainian hashtags, the ones with clear, positive sentiments (such as “Come on Ukraine”) paled in comparison with the significance of the pro-Russian hashtags.
  4. We saw a key phrase “counter hegemony” emerge in a video with pro-Russian content (and hashtags). We observed different understandings of “world peace” as presented in videos with pro-Russian and pro-Ukrainian hashtags. The pro-Ukraine videos presented “world peace” as “stopping the war”, whereas pro-Russian videos presented “countering the hegemony” as a way of achieving and maintaining world peace.
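As referenced in point 2 above, a minimal sketch of the word-cloud step could look like the following, assuming the video captions were collected per hashtag group. The wordcloud and jieba libraries, the font file and the placeholder captions are illustrative choices under those assumptions, not necessarily what the team used.

```python
# Illustrative word-cloud sketch for Chinese-language Douyin captions.
# Captions, font path and libraries are assumptions; Chinese text needs
# segmentation (here via jieba) and a CJK-capable font to render correctly.
import jieba
from wordcloud import WordCloud

# Placeholder captions standing in for the scraped Douyin descriptions.
pro_russian_captions = ["俄罗斯加油 反霸权 维护世界和平"]
pro_ukrainian_captions = ["乌克兰加油 停止战争"]

def build_cloud(captions: list[str], out_path: str) -> None:
    # Segment captions into space-separated tokens so WordCloud can count them.
    tokens = " ".join(" ".join(jieba.cut(c)) for c in captions)
    cloud = WordCloud(
        font_path="NotoSansSC-Regular.otf",  # any CJK font file on disk
        width=800, height=400, background_color="white",
    )
    cloud.generate(tokens).to_file(out_path)

build_cloud(pro_russian_captions, "cloud_pro_ru.png")
build_cloud(pro_ukrainian_captions, "cloud_pro_ua.png")
```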

Resources

  1. Query design for social media research, R. Rogers (available here)
  2. Natural language processing for digital methods research using 4CAT (available here)
  3. Creating Gephi’s nodes and edges (available here)
  4. Network analysis with Gephi (available here) 

 

Author(s): Kamila Koronska (University of Amsterdam) - based on the textual contributions and work of the entire team, namely Sam Bouman, Jinru Dong, Malin Holm, Szilvi Német, Xinwen Xu, Desislava Slavova, Jennie Williams, Maria Plichta, Aistė Meidutė, Kefeng Cao, Richard Rogers, Maria Lompe, Emillie de Keulenaar, Kamila Koronska 

Editor: Jochen Spangenberg (DW)

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.