
Europe has announced that it will step up its efforts in the fight against online disinformation and publish guidance to strengthen the Code of Practice on Disinformation. We call on the Commission to enshrine the following CONFIDENCE-building measures in the upcoming Guidance. This is an opportunity for the EU to deliver high standards, effective monitoring and adaptation to new threats, and for platforms to show that they are committed to protecting and empowering European citizens, all within transparent, verifiable and future-proof frameworks.

  1. Clear and strong commitments/KPIs, developed, monitored and regularly updated in accordance with the EC principles for better self- and co-regulation, to enable effective measurement of platforms’ achievements.
  2. Openness/Transparency. Availability, usability, neutrality and verifiability of the data necessary to assess KPIs, algorithms and moderation activities, for access, monitoring and analysis initiated by regulators, researchers, journalists and fact-checkers. A mechanism to ensure that entities denied access, or granted only partial access, by the Platforms can notify and have recourse via their national Regulator. The Code should also establish a permanent dialogue allowing regulators, researchers, journalists, fact-checkers and other relevant stakeholders to bring issues and requests to the Signatories.
  3. No measures interfering with editorial control and media content integrity. As a rule, lawful uploads of content emanating from editorial media cannot be considered disinformation. Signatories should not interfere with content that is already under a media provider’s editorial control and subject to specific standards/media regulation and independent oversight. Signatories should not alter or remove media content, so as to avoid secondary editorial decisions over legitimate content, in full respect of editorial decisions and media freedom.
  4. Future-proof. An annual review process, in conformity with the Better Regulation objectives, setting out an iterative process to fix gaps, address emerging threats and incorporate new platform designs affecting stand-alone/cross-platform commitments. For the review, a scoreboard of stand-alone and cross-platform indicators should aggregate key indicators, allowing progress to be measured.
  5. Independent expertise, audits and monitoring to ensure verification of reporting and to inform the definition of relevant data sets and formats.
  6. Demonetisation. Fixing the problem and financing the solution. A clear view of the flows of money going to inauthentic accounts and websites, transparency on the amounts refunded, and consistent information on advertising taken down and on the efficacy of the systems and policies in place.
  7. Encompassing investments and workforce. Commitments and KPIs on direct workforce, contracted parties and investments in systems, for better visibility of the commitments made in terms of FTEs (vs. contractors) and infrastructure. This data, and other data relevant to the KPIs, should be available at Member State level.
  8. Near-perfect information on advertising placement for legitimate advertisers, websites, accounts and other online advertising market participants and end-users. The commitments, associated KPIs and cross-platform information provided should allow for greater market transparency and common reporting/inputting frameworks for tracing across platforms, and should in no case serve as a means for Platforms to discriminate in granting access to media.
  9. Cross-platform and stand-alone KPIs to measure the scale of the problem and the efforts carried out within and between the Platforms, including investments and actions undertaken by platforms to tackle disinformation (e.g. labelling, suspension, amplification, demotion, demonetisation), analysed at Member State level.
  10. Enforcement. Involvement of competent bodies in the monitoring and enforcement of the CoP, including the development of tools to support the enforcement and oversight of the Signatories’ proper application of the Code.