
Brussels, 11 May 2022

Today, the Commission is proposing new EU legislation to prevent and combat child sexual abuse online. With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive. The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem is not limited to a single platform.

To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards. The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.

A new independent EU Centre on Child Sexual Abuse (EU Centre) will facilitate the efforts of service providers by acting as a hub of expertise, providing reliable information on identified material, receiving and analysing reports from providers to identify erroneous reports and prevent them from reaching law enforcement, swiftly forwarding relevant reports for law enforcement action, and providing support to victims.

The new rules will help rescue children from further abuse, prevent material from reappearing online, and bring offenders to justice. Those rules will include:

  • Mandatory risk assessment and risk mitigation measures: Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.
  • Targeted detection obligations, based on a detection order: Member States will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.
  • Strong safeguards on detection: Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.
  • Clear reporting obligations: Providers that have detected online child sexual abuse will have to report it to the EU Centre.
  • Effective removal: National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.
  • Reducing exposure to grooming: The rules require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation.
  • Solid oversight mechanisms and judicial redress: Detection orders will be issued by courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in court.

The new EU Centre will support:

  • Online service providers, in particular in complying with their new obligations to carry out risk assessments, detect, report, remove and disable access to child sexual abuse online, by providing indicators to detect child sexual abuse and by receiving the reports from the providers.
  • National law enforcement and Europol, by reviewing the reports from the providers to ensure that they are not submitted in error, and channelling them quickly to law enforcement. This will help rescue children from situations of abuse and bring perpetrators to justice.
  • Member States, by serving as a knowledge hub for best practices on prevention and assistance to victims, fostering an evidence-based approach.
  • Victims, by helping them to take down the materials depicting their abuse.

Together with today’s proposal, the Commission is also putting forward a European strategy for a better internet for kids.

Next steps

It is now for the European Parliament and the Council to agree on the proposal.

Once adopted, the new Regulation will replace the current interim Regulation.

Members of the College said

Vice-President for Democracy and Demography, Dubravka Šuica, said:

“Upholding and protecting children’s rights online as well as offline is essential to the well-being of our societies. Online child sexual abuse material is a product of the manifested physical sexual abuse of children. It is highly criminal. Online child sexual abuse has wide-ranging, long-term consequences for children and leaves deep trauma. Some may, and do, never recover. Child sexual abuse is preventable if we work together to protect children. We do not allow child sexual abuse offline, so we must not allow it online.”

Vice-President for Promoting our European Way of Life, Margaritis Schinas, said:

“The sheer amount of child sexual abuse material circulating on the web is dumbfounding. And shamefully, Europe is the global hub for most of this material. So it is really very much a question of: if we do not act, then who will? The rules we are proposing set clear, targeted and proportionate obligations for service providers to detect and remove illegal child sexual abuse content. What services will be allowed to do will be very tightly ringfenced with strong safeguards in place – we are only talking about a programme scanning for markers of illegal content in the same way cybersecurity programmes run constant checks for security breaches.”

EU Commissioner for Home Affairs, Ylva Johansson, said:

“As adults, it is our duty to protect children. Child sexual abuse is a real and growing danger: not only is the number of reports growing, but these reports today concern younger children. These reports are instrumental to starting investigations and rescuing children from ongoing abuse in real time. For example, a Europol-supported investigation based on a report from an online service provider led to saving 146 children worldwide with over 100 suspects identified across the EU. Detection, reporting and removal of child sexual abuse online is also urgently needed to prevent the sharing of images and videos of the sexual abuse of children, which retraumatises the victims often years after the sexual abuse has ended. Today’s proposal sets clear obligations for companies to detect and report the abuse of children, with strong safeguards guaranteeing privacy of all, including children.”

Background

The fight against child sexual abuse is a priority for the Commission. Nowadays, photos and videos of children being sexually abused are shared online on a massive scale. In 2021, there were 29 million reports submitted to the US National Center for Missing and Exploited Children.

In the absence of harmonised rules at EU level, social media platforms, gaming services, and other hosting and online service providers face divergent rules. Certain providers voluntarily use technology to detect, report and remove child sexual abuse material on their services. The measures taken, however, vary widely, and voluntary action has proven insufficient to address the issue. This proposal builds on the Digital Services Act and complements it with provisions to address the specific challenges posed by child sexual abuse online.

Today’s proposal follows from the July 2020 EU strategy for a More Effective Fight Against Child Sexual Abuse, which set out a comprehensive response to the growing threat of child sexual abuse both offline and online, by improving prevention, investigation and assistance to victims. It also comes after the Commission presented its March 2021 EU Strategy on the Rights of the Child, which proposed reinforced measures to protect children against all forms of violence, including abuse online.

For More Information

Q&A: New rules to fight child sexual abuse

Factsheet

Proposal for a Regulation laying down rules to prevent and combat child sexual abuse

Website

Questions and Answers: New rules to fight child sexual abuse

11 May 2022

Why are new rules needed?

The internet has proven to be a great connector, including for children and especially throughout the pandemic. However, children may also be exposed to risks online, including when it comes to child sexual abuse. Recent years have seen an overwhelming increase in child sexual abuse online, both in the sharing of child sexual abuse material and in the solicitation of children into producing sexual material of themselves or even meeting perpetrators offline. According to Europol’s analysis, in the first months of the COVID-19 crisis the demand for child sexual abuse material increased by up to 25% in some Member States. The US National Center for Missing and Exploited Children (NCMEC) also found that reports containing instances of child sexual abuse globally increased substantially, with NCMEC receiving almost 30 million reports of suspected child sexual exploitation in 2021, and law enforcement being alerted to over 4,000 new child victims. Reports of children subjected to grooming behaviour increased by more than 16% from 2020 to 2021. The circulation of pictures or videos depicting abuse among offenders re-victimises children and makes it difficult for them to find closure.

Currently, certain online service providers detect online child sexual abuse on a voluntary basis. US service providers in fact supply the majority of reports that reach law enforcement, with NCMEC forwarding EU related reports to Europol and national law enforcement.

While the measures taken by providers make an important contribution, they vary widely, with the vast majority of reports coming from a handful of providers, while a significant number take no action. Up to 95% of all reports of child sexual abuse received in 2020 came from one company, despite clear evidence that the problem is not limited to a single platform.

Voluntary action is therefore insufficient to effectively address the misuse of online services for the purposes of child sexual abuse. A clear and binding legal framework is needed, with clear safeguards, to give providers legal certainty and ensure full respect for fundamental rights.

Obliging service providers, where necessary, to detect, report and remove child sexual abuse will help rescue children from further abuse, prevent material from reappearing, and identify and prosecute offenders.

What are the key elements of the proposal?

The proposal provides a uniform approach to detecting and reporting child sexual abuse, supports the work of public authorities and seeks to boost EU efforts on prevention and assistance to victims.

It will:

  • Impose obligations on service providers to prevent child sexual abuse online by assessing and mitigating risks and, where needed, to comply with targeted orders to detect, report and remove online child sexual abuse: The proposed rules introduce an obligation for relevant online service providers to assess the risk of their services’ misuse for the dissemination of child sexual abuse materials or for the solicitation of children (“grooming”). Member States will need to designate national authorities in charge of reviewing the risk assessment and the mitigating measures proposed by the service provider to prevent child sexual abuse online. Where such authorities determine that a significant risk remains, they can ask a court or an independent administrative authority to issue a detection order for known or new child sexual abuse material or grooming to address any remaining significant risk in a targeted manner. Detection orders are limited in time, subject to strict procedural safeguards, and target a specific type of offence on a specific service. The intervention of data protection authorities is strengthened, building on the General Data Protection Regulation.
  • Introduce strong safeguards on detection: Companies having received a detection order will only be able to detect content using indicators to identify child sexual abuse, provided by the EU Centre, that have been created based on child sexual abuse online previously identified by relevant independent authorities or a court in the Member States. It is therefore not left to the provider to determine what is illegal in the EU. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.
  • Create a new EU Agency to prevent and combat child sexual abuse: The EU Centre to prevent and combat child sexual abuse will maintain a database of indicators allowing the reliable identification of child sexual abuse materials and of solicitation of children as defined by EU rules. The EU Centre will also receive and process reports from providers of any child sexual abuse materials or solicitation of children detected on their services, and will share them with the competent law enforcement authorities and Europol, unless they are submitted in error. It will function as an important safeguard by preventing false positives from being reported to law enforcement and by ensuring visibility into the effectiveness of detection measures and the transparency and accountability of the process.
Who will the new rules apply to?

The proposed rules will apply to online service providers offering services in the EU, namely hosting services and interpersonal communication services (such as messaging services), app stores and internet access providers. The new obligations will be targeted to the types of service providers whose services are most misused for child sexual abuse online, and will first and foremost aim to create incentives for stronger child protection.

These services have an important role to play in fighting child sexual abuse, as they are often the only actors in a position to detect ongoing abuse. Frequently, abuse is only discoverable thanks to the efforts of online service providers to detect child sexual abuse material on their services and to protect children from being approached by offenders online. This is particularly the case in electronic (private individual or group) communications, which offenders frequently use to exchange material and approach children.

The internet has also given offenders a new way of approaching children. They contact children on social media, gaming platforms and chats and lure them into producing compromising images of themselves or into offline meetings. In 2021, there was a three-fold increase in “self-generated” imagery showing 7-10 year olds, and NCMEC reported a sharp rise in online enticement. Service providers are the first actors in a position to counter this crisis, given how easily material can be shared globally and how offenders form networks to share resources and strategies for targeting and soliciting children.

To facilitate enforcement, providers of hosting or interpersonal communication services not established in any EU Member State, but offering their services in the EU, will be required to designate a legal representative in the EU.

What material is covered under the proposal?

The detection obligations cover known material (re-uploaded photos and videos that have been previously identified as child sexual abuse material), new material (photos and videos not previously identified), and grooming (a practice where child sexual abuse offenders build a relationship of trust and emotional connection with children in order to manipulate and sexually exploit and abuse them).

In line with the central objective of the proposal to better protect children, the identification of grooming only concerns interpersonal communications where it is known that one of the users is a child. This only occurs where the risk assessment has indicated a significant risk of misuse of the service for the purpose of online child sexual abuse, notwithstanding the mitigation measures taken by the provider.

Does the proposal cover encrypted material?

The proposed obligations on service providers as regards the detection of child sexual abuse material are technologically neutral, meaning they do not prescribe which technology should be used for detection. It is an obligation of result, not of means, leaving to the provider the choice of technology to be operated, subject to its compliance with strict safeguards.

This includes the use of encryption technology. Encryption is an important tool for the protection of cybersecurity and the confidentiality of communications. At the same time, its use as a secure channel can be abused by criminals to hide their actions, thereby impeding efforts to bring perpetrators of child sexual abuse to justice.

A large portion of reports of child sexual abuse, which are instrumental to starting investigations and rescuing children, comes from services that are already encrypted or may become encrypted in the future. If such services were to be exempt from requirements to protect children and to take action against the circulation of child sexual abuse images and videos via their services, the consequences would be severe for children. NCMEC estimates that more than half of its CyberTipline reports will vanish with end-to-end encryption, leaving abuse undetected, unless providers take measures to protect children and their privacy also on end-to-end encrypted services. Analyses show this would mean an estimated loss of 2,100 reports per day, reports that could have led to the rescue of children from ongoing abuse and the prevention of further abuse by offenders.

The Commission works closely with industry, civil society organisations and academia in the context of the EU Internet Forum to support research into technical solutions that companies can implement feasibly and lawfully, and at scale, to detect child sexual abuse in end-to-end encrypted electronic communications in full respect of fundamental rights.

The proposed legislation takes into account recommendations made under a separate, ongoing multi-stakeholder process exclusively focused on encryption arising from the December 2020 Council Resolution. This work has shown that solutions exist, but they have not yet been tested at scale. The Commission will continue to work with all relevant stakeholders to address regulatory and operational challenges and opportunities in the fight against these crimes.

What are the obligations of the service providers under these new rules?

The new rules set out obligations to assess and mitigate risks, and, where necessary, to detect, report and remove child sexual abuse online, including known and new images and videos, as well as cases of grooming.

Providers of hosting or interpersonal communication services will be obliged to conduct a risk assessment, in which they will evaluate the likelihood that the service could be used for the purpose of online child sexual abuse, and the mitigating measures they take to reduce any risk identified and hence to prevent child sexual abuse online on their services.

Based on this assessment, where the risk remains significant despite mitigating measures, the relevant national authorities may issue a detection order. Companies will be obliged to use the indicators (hashes/AI classifiers) provided by the EU Centre in their detection efforts. Detection orders are issued when a service (or the part of the service where it is possible to perform detection separately) is likely to be used for the purpose of online child sexual abuse after taking into account the mitigation measures taken by the provider. Once the order is issued, companies will be obliged to detect child sexual abuse on their services.

Reports of any child sexual abuse detected online are sent to the new EU Centre, which will check the material to eliminate erroneous reports and forward valid reports to law enforcement and Europol.
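
To make the flow above concrete, here is a minimal sketch of the detection-and-reporting pipeline as described in the proposal. All type and function names are invented for illustration; the Regulation does not prescribe any particular data model or API.

```python
from dataclasses import dataclass

# Hypothetical data model for the flow described above:
# provider detection -> EU Centre review -> law enforcement.
@dataclass
class DetectionReport:
    service: str        # the reporting provider
    content_ref: str    # reference to the flagged content
    indicator_id: str   # which EU Centre indicator was matched

def centre_review(report: DetectionReport, reviewer_confirms: bool) -> str:
    """The Centre checks each report so erroneous ones never reach police;
    rejected reports generate feedback that helps providers tune their tools."""
    if not reviewer_confirms:
        return f"{report.content_ref}: rejected as false positive, feedback sent to provider"
    return f"{report.content_ref}: forwarded to Europol and national law enforcement"

# Example: a confirmed match is forwarded, an unconfirmed match is filtered out.
report = DetectionReport("example-hosting-service", "content-123", "indicator-42")
print(centre_review(report, reviewer_confirms=True))
print(centre_review(report, reviewer_confirms=False))
```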

App stores will be required to take measures to limit the risk of children downloading apps that may expose them to a high risk of grooming, in cooperation with the providers of those apps.

Internet access providers will be obliged to disable access to images and videos that cannot be taken down, e.g. because they are hosted outside of the EU in non-cooperative jurisdictions.

How will the proposal prevent mass surveillance?

What service providers will be able to do under this legislation will be very tightly ringfenced both before and after a detection order is issued.

Firstly, detection orders are limited to situations where preventive measures do not suffice to limit the risk.

Secondly, the process for issuing a detection order is very thorough and designed to ensure the measures are limited to what is strictly necessary:

  • The proposal is built in concentric circles, narrowing down the scope of application of any obligation to detect at each step. First, it only applies to two individually identified types of providers: hosting services and publicly available interpersonal communications services.
  • Both services are required to carry out thorough risk assessments and take steps to mitigate any risks identified.
  • National authorities check these against the criteria specified, which rise in strictness along with the degree of interference. Only where the authorities are of the opinion that there is evidence of a significant risk of misuse, and that the reasons for issuing the detection order outweigh negative consequences for the rights and legitimate interests of all parties affected, having regard in particular to the need to ensure a fair balance between the fundamental rights of those parties, would they announce their intent to consider a detection order.
  • Before any order is issued, the provider is consulted. If the authorities still find there is a risk, the provider is asked to state how it would implement detection. Where this involves high-risk processing, or in every case relating to the detection of grooming, the provider must conduct a data protection impact assessment and consult the data protection authorities.
  • Only if national authorities then confirm a third time that a significant risk persists, would they be able to request an order from another independent authority or court. The independent authority or court would again reassess the case in view of all opinions and expertise submitted, including that of the data protection authorities.
  • This iterative, triple-layered process limits the measures to the greatest extent possible, ensuring they are strictly necessary.

Thirdly, once an order is issued, the legislation sets an obligation of result, not of means: companies must comply with the detection obligations but are free to choose the technology that best fits their services.

While this choice extends to encrypted services, the proposal contains a strong set of safeguards on the detection technologies used.

When issuing detection orders, national authorities have to take into account the availability and suitability of relevant technologies. This means that the detection order will not be issued if the state of development of the technology is such that there is no available technology that would allow the provider to comply with the detection order.

Detection is performed automatically and anonymously, through state-of-the-art technologies that ensure the highest possible effectiveness and minimise the impact on users’ right to privacy.

Detection can only be based on the set of indicators of online child sexual abuse kept by the EU Centre under the control of national law enforcement authorities (they are the ones who confirm that a certain item is child sexual abuse).

Human review only intervenes when indicators point to online child sexual abuse in a specific image, video or conversation excerpt (in cases of solicitation).

When human review occurs, it is first performed at the level of the Centre, so that obvious false positives are not transmitted to law enforcement. The Centre also provides feedback to the provider, to allow for improvement of the detection tools over time.

Finally, the proposal specifies that both providers and users have a right to challenge any measure affecting them in court.

In addition to those conditions, the European Data Protection Board is consulted on all the technologies to be included on the EU Centre’s list. The European Data Protection Board is also consulted on the ways in which such technologies should be best deployed to ensure compliance with applicable EU rules on the protection of personal data.

All of this means that the scope for issuing detection orders and for carrying out detection is very narrow and limited to what is strictly necessary, whilst avoiding any abuse of detection powers.

What safeguards will service providers need to take into account in their detection efforts?

The proposed rules create a balanced and targeted system, limited to what is strictly necessary and proportionate to address the misuse of relevant information society services for online child sexual abuse.

Following the risk assessment performed by online service providers, only some of them will have to verify whether child sexual abuse material is present on their services.

The proposal requires that these providers deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.

Detection systems are only to be used for the intended purpose of detecting and reporting child sexual abuse.

The proposal provides for judicial redress, with both providers and users having a right to challenge any measure affecting them in court. Users have a right to compensation for any damage that might result from processing under the proposal.

The proposal also sets out strong oversight mechanisms. These include requirements regarding the independence and powers of the national authorities charged with issuing the orders and overseeing their execution. In addition, the EU Centre to prevent and combat child sexual abuse will assess reports of potential online child sexual abuse made by providers, helping to minimise the risk of erroneous detection and reporting.

The European Data Protection Board (EDPB) and the national data protection supervisors will play a role in assessing detection technology, to ensure constant monitoring of compliance in respect of privacy and personal data.

Provisions are also made to ensure transparency, with service providers, national authorities and the EU Centre having to report every year on their activities under the proposed rules.

How will detection work and is it reliable?

Providers currently detecting child sexual abuse material on a voluntary basis typically use automated technology. Automated tools look for specific indicators of possible child sexual abuse: they are designed to check whether specific content is likely to be child sexual abuse, not to understand what the content is about.

Technologies for the detection of known child sexual abuse material are typically based on hashing, which creates a unique digital fingerprint of a specific image. Technologies currently used for the detection of new child sexual abuse material include classifiers and artificial intelligence (AI). A classifier is any algorithm that sorts data into labelled classes, or categories of information, through pattern recognition. Technologies for the detection of grooming in text-based communications make use of text-analysis technologies and/or metadata analysis. Human review is already typically in place even for the most accurate technologies such as hashing.
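
As a rough illustration of the hash-matching approach, the sketch below checks an uploaded file’s fingerprint against a set of known indicators. Real systems use perceptual hashes, which are robust to resizing and re-encoding; the exact cryptographic hash (SHA-256) and the indicator set here are simplifications chosen only to keep the example self-contained.

```python
import hashlib

# Hypothetical indicator set, standing in for the database of verified
# indicators that the EU Centre would maintain and share with providers.
KNOWN_INDICATORS = {hashlib.sha256(b"previously-verified-item").hexdigest()}

def fingerprint(content: bytes) -> str:
    """Compute a digital fingerprint of uploaded content.

    Deployed systems use perceptual hashing so that visually similar
    images still match; SHA-256 is used here purely for illustration."""
    return hashlib.sha256(content).hexdigest()

def matches_known_material(content: bytes) -> bool:
    """Check the fingerprint against the indicator set.

    A match is only a signal: under the proposal, it would be reported
    to the EU Centre for human review before reaching law enforcement."""
    return fingerprint(content) in KNOWN_INDICATORS

# Example: an exact re-upload of a known item matches; other content does not.
print(matches_known_material(b"previously-verified-item"))  # True
print(matches_known_material(b"unrelated content"))         # False
```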

Among companies that currently voluntarily detect known child sexual abuse material, the use of automatic detection is highly reliable and shows extremely low false positive rates.

High error rates (e.g. incorrectly flagging content as child sexual abuse when it is not) would be detected quickly under the proposed rules by the EU Centre, which will ensure that no false positives reach law enforcement. Companies are notified immediately when their tools produce erroneous reports and are obliged to take steps to fix them.

Will service providers receive support to comply with these new obligations?

The EU Centre will help companies fulfil their obligations. It will provide the indicators used to detect child sexual abuse, giving providers certainty about what content is illegal in the EU. The Centre will make detection technologies available free of charge and will conduct the human review of all reports. This will alleviate the burden on providers, especially smaller ones. Finally, the Centre will give feedback on the accuracy of reporting, helping providers improve their processes.

Will service providers not complying with the obligations under the proposed rules be fined?

Each case will need to be assessed by the responsible national authorities. Member States will need to set out rules on effective, proportionate and dissuasive penalties. When imposing sanctions, national authorities are asked to take into account the gravity, recurrence and duration of the breach, but also whether the breach was intentional or negligent, whether it was a first breach by the provider in question, as well as the size and financial strength of the provider and its willingness to cooperate with the authorities. Penalties should not exceed 6% of the provider’s annual income or global turnover of the last business year.
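
For a sense of how the 6% ceiling operates, the short sketch below computes the maximum possible penalty. The turnover figure is invented for the example, and the actual amount in any case would be set by the responsible national authority, potentially well below the cap depending on the criteria above.

```python
def max_penalty_eur(global_turnover_eur: float, cap_rate: float = 0.06) -> float:
    """Upper bound on a penalty under the proposed 6% ceiling.

    The real amount depends on gravity, recurrence, duration, intent,
    provider size and cooperation; this only computes the cap."""
    return global_turnover_eur * cap_rate

# Example: a provider with EUR 10 billion in global turnover could face
# a penalty of at most EUR 600 million.
print(max_penalty_eur(10_000_000_000))  # 600000000.0
```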

What will be the role and powers of the EU Centre to prevent and counter child sexual abuse?

The Centre will have three major roles: supporting efforts on prevention, improving assistance to victims, and supporting the detection, reporting and removal of child sexual abuse online.

The Centre will act as a hub of expertise on prevention. This will include raising awareness among children and parents/guardians, as well as efforts aimed at people who fear they may offend against children.

The Centre will also pool expertise and research as regards support available to survivors. It will support victims in removing the materials depicting their abuse from the internet.

The Centre will also help providers implement their obligations as regards the detection, reporting and removal of child sexual abuse online. It will facilitate providers’ access to reliable detection technologies and will maintain a database of child sexual abuse indicators (hashes, AI patterns/classifiers) that will reliably enable the detection of what is defined as child sexual abuse under EU rules. It will share those indicators with companies, providing clear information on what is illegal in the EU, instead of leaving service providers to rely on US definitions. The proposal therefore creates a proactive system, bringing all relevant service providers together to take action and reversing the current reactive system, in which EU law enforcement depends on US law and voluntary actions by companies. The Centre will receive the reports from providers, check them to avoid false positives and forward them to Europol as well as to national law enforcement authorities.

As the recipient of reports of child sexual abuse online, the Centre will have insight into the effectiveness of detection measures and will be able to ensure the transparency and accountability of the process.

When will the EU Centre to prevent and counter child sexual abuse be established?

Depending on the timeline for the adoption and implementation of the proposed Regulation, the Centre should start its work in 2024-2026. The Centre will start operations with a focus on prevention and assistance to victims. It is expected to be fully operational by 2030 as preparation for the new detection, reporting and removal process, including the creation of the database of child sexual abuse indicators, should by then be complete.

What is being done to prevent child sexual abuse from happening in the first place?

Prevention is key to combating child sexual abuse.

The existing EU rules on combating the sexual abuse of children (Directive 2011/93) require Member States to invest in prevention programmes for convicted offenders and people who fear they may offend, and to put in place prevention activities through education and awareness raising.

To help Member States implement these obligations fully and strengthen prevention, the Commission is creating a network of prevention experts, as announced in the July 2020 Strategy for a more effective fight against child sexual abuse. The network will foster communication and exchange of best practices between researchers and practitioners.

The Commission’s Joint Research Centre, in cooperation with the experts who may become part of the prevention network, has published a report presenting classification criteria for prevention programmes across the EU.

The EU is also working to improve the protection of children from sexual abuse globally by supporting and cooperating with the WeProtect Global Alliance, and by providing funding to prevention projects through the Internal Security Fund.

What are the links with the new Better Internet for Kids strategy?

The Better Internet for Kids strategy will help implement EU child safety legislation, including strengthened provisions in the planned Digital Services Act and the proposed Regulation on preventing and combating child sexual abuse and sexual exploitation.

Under the Better Internet for Kids strategy, the EU co-funds safer internet helplines and hotlines, and will continue to assist the public, in particular children, when confronted with harmful and illegal content including child sexual abuse material. If granted the status of ‘trusted flaggers’ under the conditions of the planned Digital Services Act, these helplines and hotlines will be able to contribute to a swifter assessment of and action upon notifications of illegal content online.

How does this proposal relate to the Digital Services Act?

The Digital Services Act will create a harmonised baseline for addressing all illegal content in general.

Under the Digital Services Act, child sexual abuse material and illegal content in general will be addressed in a reactive manner, on a case-by-case basis. The proposed Regulation on preventing and combating child sexual abuse and sexual exploitation will complement the general framework to be established under the Digital Services Act with specific and targeted rules to prevent and tackle the dissemination and circulation of known child sexual abuse material.

For More Information

Press Release: Fighting child sexual abuse: Commission proposes new rules to protect children

Factsheet

Proposal for a Regulation laying down rules to prevent and combat child sexual abuse

Website
