
A broad coalition, including journalists, publishers, and film and music industry heavyweights, has weighed in against the latest draft of proposed regulations that will govern how companies can comply with Europe’s new AI Act. The coalition, which represents major copyright holders, says that the current draft will not ensure that AI companies respect the terms of either the AI Act itself or European Union copyright law.
The statement from news, media, and entertainment industry organizations heaps further pressure on the EU to either overhaul or reject the “code of practice” that will tell big AI companies how they should comply with the EU AI Act.
Friday’s copyright-related intervention comes days after some of the lawmakers who negotiated the AI Act warned that the third draft had heavily watered down requirements for AI companies to test their models for “systemic risks” to society and fundamental rights. The former negotiators said the current wording was at odds with the AI law itself, and could allow serious harm to elections and the European economy.
The new statement from the major copyright holders said the draft code of practice “sets the bar so low as to provide no meaningful assistance for authors, performers, and other rights holders to exercise or enforce their rights” when the AI companies’ web crawlers encounter their content while looking for fresh data for their models. The AI Act obliges AI companies to respect rights holders’ wishes regarding the use of their copyrighted works for model training.
The new EU law, which is the most comprehensive of its kind anywhere in the world, will start to apply to providers of “general purpose AI” (GPAI) models such as OpenAI’s GPT-4o in early August, so time is running out for the industry to receive clear guidance. The code of practice, which is being crafted by external experts for the European Commission’s new AI Office, is now on its third draft. The fourth and final draft is expected in May.
“No code would be better than the fundamentally flawed third draft,” the rights-holder groups said. The signatories include the likes of the International Federation of Film Producers’ Associations, the Federation of European Publishers, the International Federation of the Phonographic Industry, and the International Federation of Journalists.
The code’s drafting process has come under heavy lobbying and political pressure from the U.S., where there is no federal AI law but where the copyright implications of AI development are also fiercely contested.
For the rights holders, one big problem is that the draft code gives them no choice of methods for rejecting the use of their copyrighted works in training; the only mechanism it requires AI companies to honor is the venerable but frequently ignored robots.txt protocol, which website operators can use to ask web scrapers to stay away. They complain that the draft code also gives AI companies no guidance on how to comply with such opt-out mechanisms.
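For readers unfamiliar with the mechanism, a robots.txt file is a plain-text file served at a site’s root that asks named crawlers to stay away from some or all pages. The sketch below is illustrative only: GPTBot and Google-Extended are real published user-agent tokens for AI training crawlers, but any given site’s file will differ, and compliance is entirely voluntary on the crawler’s part — which is precisely the rights holders’ complaint.

```txt
# Illustrative robots.txt at https://example.com/robots.txt
# Asks OpenAI's training-data crawler to skip the whole site
User-agent: GPTBot
Disallow: /

# Google-Extended is Google's opt-out token for AI model training
User-agent: Google-Extended
Disallow: /

# All other crawlers (e.g. ordinary search indexing) remain welcome
User-agent: *
Allow: /
```

Nothing in the protocol enforces these rules; a scraper that ignores the file faces no technical barrier, which is why rights holders want the code of practice to spell out compliance obligations rather than simply point to robots.txt.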
The rights holders add that the latest version of the draft makes matters worse by no longer forcing AI companies to be transparent about how they comply with rights holders’ requests to protect their content from scraping.
“A key objective of the AI Act is to give authors, performers, and other rights holders tools to exercise and enforce their rights by requiring general-purpose AI providers to put in place measures to comply with EU copyright law and provide a sufficiently detailed summary of the content ingested and used for training,” they said. “The third draft of the GPAI Code of Practice represents yet another step away from achieving this objective.”
Approached for comment on Friday, the European Commission reiterated what it said earlier this week regarding the systemic-risks warning: that the code was a compromise, and the Commission would look at the final draft before deciding whether to approve it.
The American copyright debate
Over in the U.S., there is no sign that the deregulation-friendly Trump administration will try to introduce anything as expansive as the EU AI Act. However, there is pressure from the industry to at least establish some federal rules to preempt the hundreds of sometimes mismatched state-level AI bills, and also to provide more legal certainty—particularly on the copyright issue, where many high-profile lawsuits are yet to be resolved.
Evidence uncovered in one of those cases—where authors like Sarah Silverman and Ta-Nehisi Coates are suing Meta—revealed in January that Meta knowingly trained its models on a library of pirated books called LibGen. (The latest draft of the contentious EU code says AI companies only need to make “reasonable efforts” not to crawl piracy websites.)
The Trump administration is in the process of coming up with an “AI action plan” and recently asked for industry input. OpenAI recommended that the U.S. give AI companies a clear copyright exemption, arguing that because Chinese companies allegedly don’t respect copyright law, U.S. firms would be disadvantaged if forced to do so.
Thomson Reuters, which recently won a case against a now-defunct tech firm that had copied its content to create an AI platform for lawyers, has come out against OpenAI’s proposals.
“We believe very strongly that copyright needs to be respected as part of the development process in AI,” chief product officer David Wong told Fortune on Thursday. “There is significant intellectual capital that goes into producing these works.”
Wong pointed out that Thomson Reuters is itself incorporating AI across its software platforms for sectors such as law and accounting—actually a far greater part of its business than its better-known news offerings—so it wasn’t “trying to preserve an antiquated way of working.”
However, he said it was essential to “make sure there is fairness” in the way AI companies deal with “those that create the content that will power and feed those systems.” Without that balance, Wong said, content creators could decide to make it even harder for anyone to access their works.
The British government has also been considering adding a copyright exemption for AI companies, triggering an outpouring of outrage from the U.K.’s vital creative sectors. Technology Secretary Peter Kyle said this week that musicians and filmmakers should not “resist change or try to make change too difficult to deliver.”