
Key lawmakers in the U.K. have added their voices to a pushback against government plans that would allow AI companies to train their models on creative works without asking permission.
The government proposed a new copyright exemption for AI companies in mid-December. The consultation on the proposal caused waves of concern across the country’s creative industries, which account for over 5% of the British economy.
The sector’s campaign against the exemption went into overdrive ahead of the consultation’s closure on Tuesday. All of the U.K.’s major newspapers carried identical covers protesting against the change on Tuesday morning, while musicians such as Kate Bush, Damon Albarn, and Annie Lennox released an album featuring recordings of empty music studios and performance spaces, titled Is This What We Want?
On Wednesday, Parliament’s culture and technology committees released a joint recommendation to the Labour government, urging it not to introduce the exemption at this point.
“The groundswell of concern from across the creative industries to the government’s proposals illustrates the scale of the threat artists face from artificial intelligence pilfering the fruits of their hard-earned success without permission,” Caroline Dinenage, chair of the culture committee, said.
“The government should not introduce an ‘opt-out’ approach to the use of creative works for AI training, where all works are fair game unless creators say so, given that the technical measures to enforce these opt-outs do not yet exist,” she said.
Dinenage also called for “much tougher requirements on transparency of the data being used to train AI models, so creators will know without ambiguity where they need to be remunerated for the use of their works.”
What the U.K. does now is unlikely to affect AI models that have already been released: The big AI companies are widely believed to have hoovered up everything they could find on the public internet to train those models, without asking permission, and the models cannot now unlearn what they have absorbed. There are several lawsuits alleging copyright infringement pending in the U.S. against the AI companies that have created those models, but the outcome of those cases remains unclear.
That said, the AI industry is still in its infancy with many models yet to be released. There are also global uncertainties around the legality of the AI companies’ tactics. So the U.K.’s choices now could prove influential in the future, and possibly beyond the country’s borders.
The U.K. government is also keen to position itself as a hub for the development of AI models, to attract inward investment from technology companies working on AI, and to encourage the growth of AI startups launched in the U.K. It had hoped that a copyright regime making it easier to legally obtain data for training models would be a key element in attracting this sort of investment. Indeed, a government-commissioned AI Opportunities Action Plan suggested that “copyright cleared” government-owned datasets be made available for AI training.
The parliamentary technology and culture committee chairs noted in a letter to the government that they had invited Google and OpenAI to appear at an evidence-gathering session, but the companies refused on the basis that the consultation was still ongoing.
“We chose not to press them, in part because we wanted to focus on arranging a productive session, and in part because the position of leading developers on the issue of copyright is clear,” they wrote. “Nonetheless, it is disappointing that they chose to decline our invitation. This stance is all too familiar to our committees, which share an interest in furthering the public’s understanding of how global companies develop, operate, and deploy their products, taking decisions that affect us all.”
The film director and lawmaker Beeban Kidron, who is a member of the British House of Lords and who has taken a leading role in the industry’s campaign, told Fortune on Wednesday that whatever ideas get taken up must “start with acknowledging that those that create work, whether individually, in groups, or in companies, have invested their money, time, and talent to do so, and as such own it.”
The Guardian reported that the government may partially back down on its plan, with one unnamed source saying: “There are ways to protect certain sectors which are particularly important, and to make sure big U.S. technology giants are not getting all the benefit.”
In a statement to Fortune, a spokesperson for the government’s Department for Science, Innovation, and Technology said that “no decisions will be taken until we are absolutely confident we have a practical plan that delivers each of our objectives, including increased control for right-holders to help them easily license their content, enabling lawful access to material to train world-leading AI models in the U.K., and building greater transparency over material being used.”