Fact-checkers call out Commission on anti-child abuse material proposal

European Commissioner for Home Affairs Ylva Johansson and Vice-President of the European Commission for Democracy and Demography Dubravka Suica give a press conference on a child rights package, in Brussels, 11 May 2022. [EPA-EFE / OLIVIER HOSLET]

The Commission made some false or contradictory statements in promoting the initiative to combat Child Sexual Abuse Material (CSAM), according to researchers at TU Delft, a leading Dutch technical university.

The CSAM proposal, which was put forward in May 2022 as part of a broader package on children’s rights, has proved controversial, attracting criticism from privacy advocates over its potential impact on encrypted services. 

TU Delft found that, out of six public statements made by the Commission in support of its new legislative initiative, three were incorrect. It is not the first time the Commission’s arguments on this proposal have been called into question.

In October, documents obtained via a Freedom of Information request revealed that a figure Johansson cited to support the accuracy of CSAM detection tools was an unchecked assertion from two private companies.

EU assessment of child abuse detection tools based on industry data

The Commission used data on the accuracy and precision of AI tools to detect child sexual abuse material (CSAM) online exclusively from Meta and another tech company, a FOI request by former MEP Felix Reda has shown. 

The encryption question

How digital platforms should deal with illegal content like child sexual abuse material is already regulated in the recently adopted Digital Services Act.

On CSAM specifically, until the end of 2025, there is a temporary framework in place, the ePrivacy Derogation, that allows service providers to put voluntary measures in place to detect this material in a non-encrypted environment. 

“In less than two years, the legal basis allowing for voluntary efforts by internet companies to detect child sexual abuse online will come to an end in the EU,” a Commission spokesperson told EURACTIV. “The scale and seriousness of the crime demand we act.”

With the new proposal, the EU executive went one step further, introducing the possibility for judges to issue detection orders.

In other words, if a judicial authority considers there is a significant risk that a messaging service like WhatsApp or Signal might be used to disseminate CSAM, it could require the platform to disclose the relevant communications data.

The problem is that these messaging services use end-to-end encryption, a technology that allows only the users involved in the communication to read the messages. While the Commission says the legislation is technology neutral, any technical solution so far envisaged would defeat the purpose of end-to-end encryption.
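The core property of end-to-end encryption can be illustrated in a few lines. The sketch below is a minimal illustration using the PyNaCl library, chosen here purely as an assumption for demonstration: real messengers like WhatsApp and Signal use the more elaborate Signal protocol, but the principle is the same. Each user holds a private key on their own device, and the platform relays only ciphertext it cannot read.

```python
# Minimal sketch of end-to-end encryption with PyNaCl (pip install pynacl).
# Illustrative only; not the protocol any specific messenger uses.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"hello Bob")

# The ciphertext is all the platform ever sees; it cannot decrypt it.
# Only Bob, holding his private key, can recover the plaintext.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"hello Bob"
```

Because the service provider only ever handles ciphertext, a detection order could, on this model, only be satisfied by inspecting messages before they are encrypted or by weakening the scheme itself.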

In July, the European Data Protection Supervisor and European Data Protection Board issued a joint statement raising concerns about “the impact of the envisaged measures on individuals’ privacy and personal data”. 

EU anti-child abuse law could open doors for mass surveillance, report says

Security experts warn that the tools used by tech giants to detect child abuse online pose serious security and privacy risks, raising concerns around upcoming EU legislation.

Scale of the problem

TU Delft’s report fact-checked several public statements Commission officials made to defend the proposal.

In an interview with the Dutch newspaper Trouw, Commissioner Johansson stated that the Council of Europe (CoE) estimated that one in five children is a victim of online sexual abuse.

The researchers found this to be a misrepresentation of the CoE’s finding, which is that one in five children is a victim of sexual abuse, covering both online and offline cases.

The Commission, which took more than three weeks to reply to EURACTIV’s request for comment, acknowledged that the figure is not specific to online abuse but said there is evidence that the majority of such abuse has both an online and an offline component.

In particular, the EU executive referred to an unpublished survey conducted with law enforcement authorities that indicates over 70% of child sexual abuse cases have an online component. The Commission also cited another study suggesting the figure might be higher.

During the press conference presenting the proposal, the Commission stated that CSAM reports had grown by 6,000% in the last decade, contradicting the factsheet issued at the same time, which cites a figure of 4,200%.

The EU executive maintains that the 6,000% figure is correct, citing data from the US National Center for Missing and Exploited Children (NCMEC) covering 2010 to 2020.
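To make the discrepancy concrete, a percentage-growth claim converts to a multiplier as new = old × (1 + growth / 100): 6,000% growth means reports multiplied 61-fold over the decade, while 4,200% implies a 43-fold increase. A minimal sketch of that conversion (illustrative code, not figures from the TU Delft report):

```python
# Convert a percentage-growth claim into the implied multiplier:
# growth% = (new - old) / old * 100, so new = old * (1 + growth / 100).
def implied_multiplier(growth_percent: float) -> float:
    return 1 + growth_percent / 100

print(implied_multiplier(6000))  # 61.0 -> reports multiplied 61-fold
print(implied_multiplier(4200))  # 43.0 -> reports multiplied 43-fold
```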

LEAK: Commission to force scanning of communications to combat child pornography

The European Commission is to put forward a generalised scanning obligation for messaging services, according to a draft proposal obtained by EURACTIV.

CSAM hosting

At the same press conference, Johansson said that servers in the EU host 90% of the world’s CSAM. However, in its own factsheet, the Commission puts the figure at ‘over 60%’.

One statistic quoted by Johansson in her interview was that 45% of global CSAM is hosted in the Netherlands.

TU Delft concluded that this claim was difficult to verify: sources give different proportions for the share of worldwide CSAM stored on Dutch servers, but they consistently find that the rate in the Netherlands is high.

Finally, the Commission’s statement that 85 million photos and videos of CSAM were reported in 2021 was found to be accurate, based on figures from the US NCMEC.

The Netherlands provides global hub for child pornography

The European Commission’s proposal to fight child sexual abuse material (CSAM) online is still pending, while the EU has become a ‘destination of choice’ when it comes to hosting such content, according to a new report.

[Edited by Nathalie Weatherald]
