Report: Visual assessment of Coordinated Inauthentic Behaviour in disinformation campaigns

EU DisinfoLab has authored a report under the EU-funded project vera.ai entitled "Visual assessment of Coordinated Inauthentic Behaviour in disinformation campaigns" (as part of T4.3). A summarising introduction follows, and the full document can be found at the end of this page.

The present report examines three disinformation, manipulation, and interference campaigns: ‘Operation Overload’ (by CheckFirst), ‘A massive Russian influence operation on TikTok’ (by DFRLab and BBC Verify), and ‘QAnon’s “Save the Children” campaign’ (by multiple stakeholders). Through a structured visual approach, it assesses how these cases align with key indicators of Coordinated Inauthentic Behaviour (CIB).

To determine whether a campaign exhibits CIB, the report applies a set of 50 CIB indicators drawn from EU DisinfoLab’s previous work, Revisit the Coordinated Inauthentic Behaviour Detection Tree. The presence of each indicator is evaluated and the results are quantified to generate a probability score for CIB, ranging from 0 to 100%.
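The scoring idea can be sketched as follows. This is a minimal illustration, not the report's actual method: the indicator names are invented, and equal weighting of indicators is an assumption (the real list is the 50 indicators from the Detection Tree).

```python
# Sketch: express the share of observed CIB indicators as a 0-100% score.
# Indicator names below are hypothetical; equal weighting is assumed.

def cib_score(observed: set[str], indicators: list[str]) -> float:
    """Return the percentage of CIB indicators present in a campaign."""
    hits = sum(1 for indicator in indicators if indicator in observed)
    return 100.0 * hits / len(indicators)

# Illustrative (invented) indicator subset:
INDICATORS = [
    "synchronised_posting",
    "recycled_content",
    "inauthentic_accounts",
    "amplification_network",
]

# A campaign showing 2 of the 4 example indicators scores 50%.
score = cib_score({"synchronised_posting", "inauthentic_accounts"}, INDICATORS)
```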

Additionally, the findings are visually represented using colour-coded gauges across five key dimensions: Coordination, Authenticity, Source, Impact, and Final Assessment. This approach aims to enhance understanding of CIB tactics and provide the defender community with practical tools to detect and mitigate such activities.
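A gauge of this kind simply maps a dimension's score onto a colour band. The thresholds below are assumptions for illustration only; the report does not publish its exact colour bands here.

```python
# Sketch: map a 0-100 dimension score to a gauge colour.
# The 33/66 thresholds are assumed for illustration.

def gauge_colour(score: float) -> str:
    """Return a traffic-light colour for a dimension score."""
    if score < 33:
        return "green"   # low likelihood of CIB
    if score < 66:
        return "amber"   # moderate likelihood
    return "red"         # high likelihood

# The five dimensions assessed in the report:
DIMENSIONS = ["Coordination", "Authenticity", "Source", "Impact", "Final Assessment"]
```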

The paper is available below and on the EU DisinfoLab website.

Author: Ana Romero (EUDL)

Editor: Anna Schild (DW)

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.