Service providers in focus of new compromise text of online child sexual abuse draft law

[Ground Picture/Shutterstock]

The Belgian EU Council Presidency’s latest compromise text of the draft law on detecting and preventing online child sexual abuse material further clarifies risk categorisation thresholds and lays out service providers’ data retention obligations.

The regulation, which aims to create a system for detecting and reporting online child sexual abuse material (CSAM), has faced criticism for potentially allowing judicial authorities to request the scanning of private messages on platforms like WhatsApp or Gmail.

The latest compromise text, dated 9 April and seen by Euractiv, was sent by the Belgian EU Presidency to the Law Enforcement Working Party (LEWP), responsible for legislative and operational issues related to cross-border policing.

Service providers

Under the draft legislation, online service providers can voluntarily report to the Coordinating Authority even when there is only a suspicion that their services are being used for abuse, rather than concrete evidence of such abuse. These reports may lead to the issuing of detection orders.

The new compromise text specifies that providers must record and retain hits during detection orders.

However, they are not required to report the data until hits related to potential new CSAM have been flagged twice, and hits related to attempts to solicit children have been flagged three times, all within the timeframe of the respective detection order.

Competent authorities are national judicial authorities, while the Coordinating Authority in each EU country oversees risk assessments and mitigation measures, as well as efforts to detect, report, and remove CSAM.

Detection orders mandate service providers to actively search for and report instances of CSAM or attempts to solicit children on their platforms.

Providers are instructed to voluntarily report only potential new instances of abuse for users with repeated behaviour, to ensure proportionality. Providers should record the first instance of such behaviour and preserve it during a detection order, deleting the records once the order expires.

In this part of the regulation, the reference to new CSAM contrasts with earlier references to known CSAM. In this context, “known” material refers to content that is already circulating and has been detected before, while “new” material has not yet been identified.

Providers must also assist the EU Centre – a planned new centralised hub to help fight CSAM – in conducting functional and security audits at the source code level, the new text says.

In practice, this means the audits would examine the underlying software of the platform being audited and how it is involved in the dissemination of CSAM.

Risk categorisation

The new text also lays out specific thresholds to categorise service providers based on just how risky their services are. Previous drafts outlined the categorisation criteria more broadly.

In the compromise text, indicators are weighted according to their impact on child sexual abuse risks, and the resulting scores determine whether a service is high, medium, or low risk. Services scoring above 60% of the risk indicators are classified as high-risk, those between 25% and 60% as medium-risk, and those below 25% as low-risk.
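The text itself does not spell out the scoring formula, which is left to Annex XIV. As a purely illustrative sketch of how such a weighted-indicator classification could work, the following assumes hypothetical indicator names and weights, with the score taken as the weighted share of indicators a service triggers and the 60% and 25% thresholds from the compromise text:

```python
# Illustrative sketch only: the draft's actual scoring methodology is set out in
# Annex XIV, which is not included in the document seen by Euractiv.
# Assumption: each risk indicator carries a weight, the score is the weighted
# share of indicators a service triggers, and the 60%/25% thresholds apply.

def classify_risk(indicator_weights: dict[str, float], triggered: set[str]) -> str:
    """Return 'high', 'medium' or 'low' risk based on the weighted share of triggered indicators."""
    total = sum(indicator_weights.values())
    score = sum(w for name, w in indicator_weights.items() if name in triggered) / total
    if score > 0.60:
        return "high"
    if score >= 0.25:
        return "medium"
    return "low"

# Hypothetical indicators and weights, loosely based on the criteria named in the text
# (service size, type, architecture, safety policies, user tendencies).
weights = {
    "service size": 2.0,
    "service type": 1.0,
    "architecture": 1.5,
    "safety policies": 1.0,
    "user tendencies": 1.5,
}
print(classify_risk(weights, {"service size", "user tendencies"}))  # 3.5 / 7.0 = 0.5 -> "medium"
```

The actual indicators, weights, and scoring rules would be those defined in the annex, which may differ from this sketch.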

The text mentions Annex XIV for methodology, but the annex itself is not present in the document, similar to the previous compromise text from 27 March, where the risk categorisation annex was detailed separately.

The risk categorisation methodology must rely on provider reports to the Coordinating Authority. Different versions of self-assessment templates should be provided based on the size and type of services offered. Scoring should be based on criteria such as service size, type, architecture, safety policies, and user tendencies.

A new addition to the text specifies that the Coordinating Authority can ask providers to update risk assessments earlier than planned if evidence suggests a significant change in the risk of online child sexual abuse. This includes insights from providers offering low or medium-risk services.

EU Centre

The updated text broadens the EU Centre’s role to include advising the Commission on approving technologies for detecting both known and new CSAM or grooming.

It also allows the Centre to conduct source code level audits when providing opinions on technologies. Source code level means examining the actual programming instructions of the software or platform being audited.

The Technology Committee and Victims Board, comprised of experts and victims, will guide the Centre’s activities by advising on the approval of such technologies.

Coordinating Authorities and competent authorities

Member states should have the freedom to appoint either a judicial or an independent administrative body to issue specific orders, notably with regard to the right to effective judicial redress against decisions made by competent authorities, the text says.

Age verification

The previous compromise text already said that age verification measures must prioritise privacy and the child’s interests, without user profiling or biometric identification.

The new version emphasises that age verification and age assessment measures must respect the principles governing the processing of personal data, in particular lawfulness, purpose limitation, and data minimisation.

[Edited by Eliza Gkritsi/Zoran Radosavljevic]
