On Tuesday, the Civil Liberties Committee adopted its position on new measures to protect children online by preventing and stopping child sexual abuse.
The draft Parliament position was adopted by the Committee on Civil Liberties, Justice and Home Affairs with 51 votes in favour, 2 against, and 1 abstaining. Inter-institutional negotiations were authorised with 48 in favour, 2 against, and 4 abstaining.
To protect children online, the new rules would require internet providers to assess whether there is a significant risk of their services being misused for online child sexual abuse or for the solicitation of children, and to take measures to mitigate these risks. MEPs want mitigation measures to be targeted, proportionate and effective, with providers able to decide which ones to use. They also want to ensure that pornographic sites have adequate age verification systems, flagging mechanisms for child sexual abuse material (CSAM) and human content moderation to process these reports.
To stop minors being solicited online, MEPs propose that services targeting children should require user consent by default for unsolicited messages, offer blocking and muting options, and boost parental controls.
Detection orders
To avoid mass surveillance or generalised monitoring of the internet, the draft law would allow judicial authorities to authorise time-limited orders, as a last resort, to detect any CSAM and take it down or disable access to it, when mitigation measures are not effective in removing it.
In addition, MEPs emphasise the need to target detection orders to individuals or groups (including subscribers to a channel) linked to child sexual abuse using “reasonable grounds of suspicion”.
In the adopted text, MEPs excluded end-to-end encryption from the scope of the detection orders to guarantee that all users’ communications are secure and confidential. Providers would be able to choose which technologies to use as long as they comply with the strong safeguards foreseen in the law, and subject to an independent, public audit of these technologies.
EU Centre for Child Protection
The law would set up an EU Centre for Child Protection to help implement the new rules and support internet providers in detecting CSAM. It would collect, filter and distribute CSAM reports to competent national authorities and Europol. The Centre would develop detection technologies for providers and maintain a database of hashes and other technical indicators of CSAM identified by national authorities.
The Centre would also support national authorities as they enforce the new child sexual abuse rulebook, conduct investigations and levy fines of up to 6% of worldwide turnover for non-compliance.
Finally, MEPs propose to create a new Victims' Rights and Survivors Consultative Forum to make sure that victims' voices are heard.
Quote
Rapporteur Javier Zarzalejos (EPP, Spain) said:
“To meet this compelling challenge effectively, we have found a legally sound compromise supported by all political groups. It will create uniform rules to fight the sexual abuse of children online, meaning that all providers will have to assess if there is a risk of abuse in their services and mitigate those with tailor-made measures. As a last resort, detection orders can be used to take down abusive material still circulating on the internet. This agreement strikes a balance between protecting children and protecting privacy.”
Next steps
The draft Parliament position still needs to be endorsed by the plenary. On 20 November, the start of negotiations will be announced, and MEPs have until the end of the following day to object. If a sufficient number choose to do so, there will be a vote during the same session.
MEPs support new rules to combat child sexual abuse online
Brussels, 14 November 2023
MEPs in the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) have voted on new rules to combat child sexual abuse online.
Following the Commission’s 2022 proposal for a Regulation laying down rules to prevent and combat child sexual abuse, political groups have reached a compromise that would set up a legal framework to remove child sexual abuse material online and stop its dissemination, while safeguarding fundamental rights and keeping end-to-end encrypted communications protected.
The result of the vote was 51 in favour, two against and one abstention. The European Parliament is expected to endorse the LIBE committee’s position during next week’s plenary. When the Council reaches a common position, inter-institutional negotiations can then begin.
Representing the S&D Group in the negotiations, Paul Tang MEP said:
“Today’s vote in the European Parliament is an important step forward in making the online world safer for children. The digital playground as it stands is not safe. Children are exposed to risks and dangers that can cause deep and lasting damage.
“In the European Parliament, we have managed to strike a balance between protecting fundamental rights and protecting children by stopping grooming and dissemination of child sexual abuse material. We urge national governments to look at our example as a way forward to unblock negotiations in the Council.”
Source – S&D Group Press Release – Email