Photo by viarami on Pixabay

Brussels, 24 November 2022

Today, the European Commission released the results of its seventh evaluation of the Code of Conduct on countering illegal hate speech online. This year’s results unfortunately show a decrease in companies’ notice-and-action results: the share of notifications reviewed by the companies within 24 hours dropped compared with the last two monitoring exercises, from 90.4% in 2020 to 81% in 2021 and 64.4% in 2022. TikTok is the only company that improved its time of assessment. The removal rate, at 63.6%, is also considerably lower than at its peak in 2020 (71%). Only YouTube performed better on this parameter than in the last two years. There is, however, a positive development in the companies’ frequency and quality of feedback to users, something the Commission had called on companies to improve in the 2021 report.

The seventh evaluation shows that:
  • Companies reviewed 64.4% of notifications within 24 hours, which shows a decrease as compared to the last two monitoring exercises (81% in 2021 and 90.4% in 2020). Only TikTok has increased its performance (from 82.5% in 2021 to 91.7% in 2022).
  • The removal rate was 63.6%, similar to 2021 (62.5%), but still lower than in 2020 (71%). YouTube improved its removal rate in 2022 (90.4%), as compared to 2021 (58.8%). All the other IT companies removed less content than in 2021, in some cases with minor variations (Facebook removed 69.1% in 2022 and 70.2% in 2021; Twitter removed 45.4% and 49.8%, respectively).
  • On average, 69.6% of content calling for murder or violence against specific groups was removed, while content using defamatory words or pictures to name certain groups was removed in 59.3% of cases, showing a better response rate to the most serious manifestations of online hatred.
  • IT companies’ feedback to users improved in 2022 with respect to 2021. Many companies have done better, in particular TikTok (74.8% of notifications addressed, compared to 28.7% in 2021) and Instagram (72.6%, compared to 41.9% in 2021 and 62.4% in 2020).

To support the implementation of the Code of Conduct and address the gaps in notice-and-action, the IT companies and the network of trusted flagger organisations involved in the monitoring exercises have now agreed on an action framework. It lays down cooperation initiatives between the parties, in which they commit to strengthening their dialogue to counter hate speech online.

Next steps

The Commission will continue monitoring the implementation of the Code of Conduct. It will support IT companies and trusted flagger organisations in implementing the action framework agreed in the context of the Code of Conduct. The Digital Services Act (DSA) entered into force on 16 November 2022. The Act provides comprehensive rules for platforms’ responsibilities and will further support co-regulatory frameworks. The Commission will discuss with the IT companies how to ensure that the implementation of the Code supports compliance with the DSA and adds value in the specific areas of tackling hate speech and protecting freedom of expression online. This process may lead to a revision of the Code of Conduct in the course of 2023.

Background

The Framework Decision on Combating Racism and Xenophobia criminalises public incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin. As defined in this Framework Decision, hate speech is a criminal offence also when it occurs online.

In order to respond to the proliferation of racist and xenophobic hate speech online, the European Commission and four major IT companies (Facebook, Microsoft, Twitter and YouTube) presented a Code of Conduct on countering illegal hate speech online on 31 May 2016. Since then, Instagram, Snapchat, Dailymotion, Jeuxvideo.com, TikTok, LinkedIn and, in spring 2022, Rakuten Viber and Twitch have joined the Code.

The Code of Conduct is based on close cooperation between the European Commission, IT companies, civil society organisations (CSOs) and national authorities. All stakeholders meet regularly under the umbrella of the High Level Group on combating hate speech and hate crime to discuss challenges and progress.

Each monitoring exercise was carried out following a commonly agreed methodology, which makes it possible to compare results over time. The seventh exercise was carried out between 28 March and 13 May 2022 by 36 organisations from 21 Member States. A total of 3,634 notifications were submitted to the IT companies: 2,765 through the reporting channels available to general users and 869 through the specific channels available only to trusted flaggers/reporters.

On 9 December 2021, the Commission presented an initiative to extend the list of ‘EU crimes’ to hate speech and hate crime, as currently there is no legal basis to criminalise hate speech and hate crime at EU level. The existing list of EU crimes in the Treaty on the Functioning of the European Union (TFEU) needs to be extended to ensure minimum common rules on how to define criminal offences and sanctions applicable in all EU Member States.

On 9 March 2022, the Commission also proposed EU-wide rules to combat violence against women and domestic violence, including online. The proposal aims to ensure that the most serious forms of violence against women, such as rape, female genital mutilation and gender-based cyber violence, including cyber stalking, cyber harassment and the non-consensual sharing of intimate images, are criminalised across the EU. Victims of cyber violence will also be entitled to adequate support, including advice on how to seek legal help and how to remove online content.

The Digital Services Act includes rules for online intermediary services, which millions of Europeans use every day. The obligations of different online players match their role, size and impact in the online ecosystem. Building on the experience from the Code and its monitoring exercise, obligations related to clear notice and action systems, priority treatment of notices from trusted flaggers, feedback on notices to users and extensive transparency obligations seek to address the identified shortcomings. Specific rules are laid down for very large online platforms reaching more than 45 million users in Europe. These platforms with a systemic role have to assess the risks their systems pose and take mitigating measures to curb the dissemination of illegal content and address other societal risks such as negative effects on fundamental rights or the spread of disinformation. The performance under the Code sets a benchmark for how such platforms tackle illegal hate speech.

For more information

7th evaluation – factsheet

Information provided by the IT companies about measures taken to counter hate speech  

Annex to the Code – Joint statement by trusted flagger organisations and IT companies for an action framework on enhanced cooperation

Source – EU Commission
