Deliverables

Deliverables come in various guises: project reports, demonstrators, prototypes and the like. All but two deliverables of the vera.ai project are public, meaning they will be made available to an interested audience.

Project deliverables are listed per workpackage below, together with an indication of when they are contractually due (this does not necessarily mean they will also be made available on those dates). Once they are ready for sharing (and provided they are public), they are accessible below as well as further down on this page in “direct view format” (there in chronological order).

A final note on labelling / timing: the project started on 15 Sep 2022, so M6 means month 6 after project start, i.e. 14 March 2023, and so on.

Overview of ALL Deliverables and dates at which they are formally due - listed per workpackage

D1.1 Data Management Plan (report, public, due M6). Detailing how data in the project will be managed. An updated version (v2.0) of the Data Management Plan has also been produced.

D2.1 AI against Disinformation: Use Cases and Requirements (report, sensitive / restricted, due M9). Reports on the methodology, co-creation activities and the resulting use cases and requirements.

D2.2 Evaluation Report (report, public, due M36). Reports on evaluation methodology and consolidated results.

D3.1 Explainable AI methods for analysis and verification of text, audio, image & video misinformation (other / report, public, due M18). Methods developed and experiments carried out until M18, accompanied by reference software implementations and services.

D3.2 Multimodal deepfake and manipulation analysis methods (other / report, public, due M32). Methods, software services, and a performance evaluation report on RTD activities in related tasks.

D3.3 Cross-modal and user feedback enhanced verification tools (other / report, public, due M34). Methods, software services, and a performance evaluation report on RTD activities in related tasks.

D4.1 Cross-lingual and multimodal near-duplicate search methods (other / report, public, due M18). Methods developed and experiments carried out in related tasks, including reference software implementations and services.

D4.2 Coordinated sharing behaviour detection and disinformation campaign modelling methods (other / report, public, due M34). Methods, evaluation, and software services on activities in related tasks.

D4.3 Disinformation Impact and Platform Algorithm Assessments (other / report, public, due M36). Methods developed and experiments carried out in related tasks.

D5.1 Annotation model, API definitions, and Database of Known Fakes first release (report / prototype, public, due M12). Documentation and code from related tasks and first release of the Database of Known Fakes (DBKF).

D5.2 AI-enhanced Verification Tools for Professionals v1 and DBKF interim release (demonstrators, public, due M18). First release of the enhanced verification plugin, Truly Media and the verification assistant, and interim release of the Database of Known Fakes (DBKF).

D5.3 AI-enhanced Verification Tools for Professionals/DBKF final release and integration with AI platforms (demonstrators, public, due M36). Final public release of the professional-oriented tools and integration results with relevant AI platforms.

D6.1 Plan for dissemination, communication, clustering & exploitation activities (dissemination / exploitation / outreach activities (DEC) and report, public, due M6; website launch due M3). Outline of dissemination, exploitation and clustering activities plan / strategy, and report on activities to date.

D6.2 Dissemination & Outreach Activities Report (dissemination outreach activities (DEC) and report, public, due M36) outlining respective activities undertaken in course of the project.

D6.3 Exploitation and Sustainability Report (report, sensitive / restricted, due M36) outlining the respective activities undertaken in the course of the project.

PUBLIC Deliverables (in the order in which they were made available)

D6.1 Plan for dissemination, communication, clustering & exploitation activities. Outline of dissemination, exploitation and clustering activities plan / strategy, and report on activities to date.

N.B.: draft version, subject to approval. Made available on 13 March 2023.

D1.1 Data Management Plan. Detailing how data in the project will be managed.

N.B.: draft version, subject to approval. Made available on 14 March 2023.

D5.1 Annotation model, API definitions, and Database of Known Fakes first release. Documentation and code from related tasks and first release of the Database of Known Fakes (DBKF).

N.B.: draft version, subject to approval. Made available on 21 December 2023.

D1.1 Data Management Plan (UPDATE). Detailing how data in the project will be managed.

N.B.: draft version, subject to approval. Made available on 21 December 2023.

D4.1 Cross-lingual and multimodal near-duplicate search methods. Methods developed and experiments carried out in related tasks, including reference software implementations and services.

N.B.: draft version, subject to approval. Made available on 14 March 2024.

D5.2 AI-enhanced Verification Tools for Professionals v1 and DBKF interim release. First release of the enhanced verification plugin, Truly Media and the verification assistant, and interim release of the Database of Known Fakes (DBKF).

N.B.: draft version, subject to approval. Made available on 14 March 2024.

D3.1 Explainable AI methods for analysis and verification of text, audio, image & video misinformation. Methods developed and experiments carried out until M18, accompanied by reference software implementations and services.

N.B.: draft version, subject to approval. Made available on 21 March 2024.

vera.ai is co-funded by the European Commission under grant agreement ID 101070093, and the UK and Swiss authorities. This website reflects the views of the vera.ai consortium and respective contributors. The EU cannot be held responsible for any use which may be made of the information contained herein.