Some rather extraordinary news has broken regarding the European Commission’s attempt to force tech companies to scan users’ uploads and private messages for child sexual abuse material (CSAM).
According to a lengthy investigation by a group of European news outlets, the proposal followed close coordination between Home Affairs Commissioner Ylva Johansson and the U.S. company Thorn, which was founded by actors Ashton Kutcher and Demi Moore 11 years ago to fight the scourge of online CSAM. (Thorn ejected Kutcher as board chairman earlier this month, due to his scandalous defense of former colleague and convicted rapist Danny Masterson.)
“We have shared many moments on the journey to this proposal,” Johansson wrote to Thorn executive director Julie Cordua, according to the Balkan Investigative Reporting Network’s English-language report of the investigation. “Now I am looking to you to help make sure that this launch is a successful one.” Days later, in May last year, Johansson unveiled her proposal for the CSA Regulation, which is currently being scrutinized by the European Parliament.
Few oppose the objective of fighting CSAM, but cryptography experts have blasted the proposal—dubbed "chat control" by its opponents—saying you can’t alter end-to-end encryption systems to allow this sort of client-side scanning without busting people’s privacy and security. That’s also pretty much what Apple said when defending its decision to abandon CSAM scanning on iCloud. The EU’s own internal legal service warned earlier this year that the proposal could bring about “permanent surveillance of all interpersonal communications” and would probably be nixed by the courts.
Thorn is a registered nonprofit, but it sells a system called Safer that federal agencies and tech companies—like Slack, Flickr, GoDaddy, and even OpenAI—use to spot what is known or suspected to be CSAM. Known images are identified by comparing the hashes of uploaded images to those stored in a vast database (this is the element of Safer that OpenAI uses), while previously unseen images are flagged as suspected CSAM by machine-learning classifiers.
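The hash-matching side of that approach can be sketched in a few lines. This is a simplification and not Thorn's actual implementation: production systems use perceptual hashes (PhotoDNA-style fingerprints that survive resizing and re-encoding) rather than the exact-match cryptographic hash shown here, and the database name and contents below are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery.
# (This example entry is just the SHA-256 of b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_image(data: bytes) -> bool:
    """Return True if the uploaded file's hash appears in the database."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES
```

Because a cryptographic hash changes completely if even one byte of the file changes, real scanning systems instead compute a perceptual fingerprint of the image's visual content and look for near-matches in the database.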
Despite the apparently commercial nature of this operation, which would obviously benefit from the CSA Regulation as proposed, the report notes that Thorn is “registered in the EU lobby database as a charity” and has had meetings under that classification with several EU commissioners. The organization apparently paid over $630,000 for lobbying last year. The report also highlights Europol officials' suggestion to the Commission that "there are other crime areas [apart from CSAM] that would benefit from detection." Meanwhile, privacy activists—who are more than a little alarmed at the implications of such message-scanning systems—say they have struggled to get the Commission’s attention.
“The investigation published today confirms our worst fears: The most criticized European law touching on technology in the last decade is the product of the lobby of private corporations and law enforcement,” said Diego Naranjo, policy chief at digital rights organization EDRi, in a statement.
Thorn insists that its income from Safer sales does not generate any profits, and says it remains reliant on donations to cover the rest of its costs. Regarding its influence, it said in an emailed statement: “Our technical expertise is unique. We make this expertise available to policymakers to support the EU’s legislation in this space.”
Johansson’s office did not respond to a request for comment. More news below.
Want to send thoughts or suggestions to Data Sheet? Drop a line here.
David Meyer