EU assessment of child abuse detection tools based on industry data

"It is simply not meaningful on a technical level for a company to say it has run a test and state a certain accuracy when the exact test data is not known," Felix Reda told EURACTIV.  [Tero Vesalainen/Shutterstock]

The Commission used data on the accuracy and precision of AI tools to detect child sexual abuse material (CSAM) online exclusively from Meta and another tech company, an access to documents request filed by former MEP Felix Reda has shown.

The impact assessment for the proposed CSAM regulation disregarded independent research, control tests and further details on the tech companies' underlying datasets.

The EU proposal includes the possibility of court orders requiring the scanning of encrypted communications to detect CSAM, triggering concerns that it could open the door to massive numbers of false positives, with private content being disclosed to law enforcement.

To address this criticism, European Commissioner for Home Affairs Ylva Johansson, who is leading on the file, published a blog post on 7 August stating that technology is available to detect new child sexual abuse material with a precision rate of 99.9%.

The following day, Felix Reda, a former EU lawmaker who now works for the non-profit organisation Gesellschaft für Freiheitsrechte, sent a Freedom of Information (FOI) request to the EU executive, asking for the sources of this precision rate.

In its response via AskTheEU, which arrived weeks late even after the deadline extension granted for exceptional cases, the Commission cited Meta and Thorn’s commercial product Safer as the sources of the data. According to Thorn, the “latest tests” of Safer’s classification model show a 99% precision rate.

“It is simply not meaningful on a technical level for a company to say it has run a test and state a certain accuracy when the exact test data is not known,” Reda told EURACTIV. 

On the open internet, the number of falsely detected items could be significantly larger, because the proportion of CSAM among all content on messenger services is probably very small, Reda continued.

The number of false positives matters because harmless but explicit messages, chats and photos from innocent people could end up on the screens of investigators.
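Reda's objection is, at its core, a base-rate problem: a precision figure only means something relative to how much CSAM the test data contained. The minimal sketch below illustrates this point with purely hypothetical rates (none of them drawn from Meta's or Thorn's undisclosed tests), showing how the same classifier's precision can collapse once CSAM makes up only a tiny fraction of the scanned content.

```python
# Illustrative sketch of the base-rate argument. All numbers are
# hypothetical assumptions, not figures from any published test.

def precision(sensitivity: float, false_positive_rate: float,
              prevalence: float) -> float:
    """Share of flagged items that are actually CSAM (positive predictive value)."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A classifier that looks excellent on a balanced test set
# (half CSAM, half harmless content)...
print(precision(0.99, 0.001, prevalence=0.5))     # ~0.999

# ...flags mostly innocent content when CSAM is only
# 1 in 10,000 messages: roughly 9 in 10 flags are false positives.
print(precision(0.99, 0.001, prevalence=0.0001))  # ~0.09
```

This is why, without knowing the composition of the test data, a headline precision rate says little about how the tool would behave when scanning ordinary users' communications.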

Thorn did not provide EURACTIV with details on the datasets and methods behind its tests in time for publication.

Furthermore, the Commission should have conducted tests itself instead of relying on companies that might themselves profit as future providers of chat control software, Reda argued.

The Commission presented its proposal to prevent and combat CSAM in May 2022, sparking backlash due to its suggestion of a generalised scanning obligation for messaging services and the risk of breaking up end-to-end encryption, as EURACTIV reported.

In July, the European Data Protection Board and the European Data Protection Supervisor adopted a joint opinion, saying that the proposal in its current form may present more risks to third-party individuals than to the criminals it intends to pursue for CSAM.

Next Monday, Commissioner Johansson will present the proposal to the Committee on Civil Liberties, Justice and Home Affairs (LIBE), which will start negotiations in Parliament. The Council is already working on the text but is not yet dealing with the controversial filtering obligations. 

The Commission did not immediately respond to EURACTIV’s request for comment.

[Edited by Luca Bertuzzi/Nathalie Weatherald]
