
The UK government has officially abandoned plans to allow artificial intelligence companies to train their models on copyrighted music without explicit permission from the creators.

Technology Secretary Liz Kendall confirmed the policy shift on 18 March 2026, stating that the Government "no longer has a preferred option" regarding the previous AI copyright reform proposals.

The move marks a major victory for the British creative industry, which had spent years lobbying against what it described as a "land grab" by big tech firms.

The original proposal sought to introduce a broad copyright exception, allowing AI developers to scrape vast archives of music, literature, and art to train generative models.

Under that scrapped system, creators would have been forced to manually "opt out" to prevent their work from being used, a process many argued was impossible to manage at scale.

Now, the government is moving toward a framework that prioritises the rights of artists over the rapid expansion of machine learning datasets.

The Government Reversal

The decision to scrap the AI copyright exception follows months of intensive consultations with stakeholders across the music, film, and publishing sectors.

Liz Kendall’s announcement represents a total pivot from the previous government's stance, which aimed to make the UK a "global AI superpower" by loosening intellectual property restrictions.

The initial plan would have allowed developers to use copyrighted material for training purposes for free, provided they obtained a licence before commercialising any final products.

However, industry bodies and legal experts argued that the damage would be done during the training phase, where the "essence" of a creator's style could be captured without compensation.

The Department for Science, Innovation and Technology admitted that the feedback received "overwhelmingly rejected" the opt-out approach.

The tech sector had argued that strict copyright laws would stifle innovation and drive AI startups to move their operations to the United States or China.

Despite these warnings, the UK government has decided that the protection of the £146 billion creative economy takes precedence over unregulated technological growth.

This creative sector currently accounts for roughly 7% of all jobs in the United Kingdom, providing a significant buffer for the national economy.

Officials are now looking at "market-led" solutions, where AI companies must negotiate directly with labels and publishers for the rights to use their catalogues.

The government has also indicated it will closely monitor ongoing litigation in the High Court, where several high-profile copyright cases are currently being heard.

The Creative Backlash

The primary catalyst for this policy reversal was a sustained and high-profile campaign led by some of the most influential figures in global music.

Sir Elton John, Sir Paul McCartney, Dua Lipa, and Coldplay were among the hundreds of artists who signed an open letter demanding transparency from AI developers.

The letter urged the government to ensure that AI companies are required to disclose exactly which copyrighted materials are used to train their models.

Artists argued that allowing AI to mimic their voices and styles without permission was not just a financial threat, but an existential one to the concept of human artistry.

Dua Lipa previously described the unauthorised use of artist data as a "theft of soul," echoing sentiments shared across the Glastonbury and Brit Award circuits.

The BPI, the trade body representing the UK’s recorded music industry, described the original "commercial research" exception as "deeply troubling."

They argued that such a loophole would grant tech giants excessive power to exploit the work of independent artists who lack the legal resources to defend their intellectual property.

The music industry has been particularly sensitive to AI developments following the rise of "deepfake" tracks that went viral on social media last year.

These tracks often used AI-generated vocals that were indistinguishable from real artists, leading to millions of streams for content that the original creators never sanctioned.

By scrapping the training exception, the government has sided with the argument that human-led creativity requires a protected environment to remain commercially viable.

Mapping the New AI Framework

With the broad copyright exception off the table, the UK is now pivoting toward a more granular regulatory framework focused on "digital replicas" and transparency.

The government plans to introduce strict labelling requirements for AI-generated content to ensure consumers know when they are listening to or viewing machine-made work.

New protections are also being explored to prevent the creation of digital replicas (AI clones of a person's voice or likeness) without their express informed consent.

This move is intended to protect not just musicians, but actors, broadcasters, and public figures who are increasingly vulnerable to high-quality deepfakes.

Instead of a "one-size-fits-all" exception, the government is encouraging the development of technical standards that allow creators to bake "do not train" tags into their digital files.

These standards would be backed by law, making it a regulatory offence for AI scrapers to ignore the metadata attached to a piece of music or art.
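As a purely illustrative sketch of how such a machine-readable "do not train" tag might be honoured, the snippet below has a scraper check a sidecar metadata file before ingesting a track. The tag name (`tdm-reservation`) and the sidecar JSON format are assumptions for illustration, not an actual adopted standard; note that under a licence-first regime, the safe default for a file with no metadata is exclusion, not inclusion.

```python
# Hypothetical sketch: honouring a "do not train" reservation before adding
# a file to a training corpus. The "tdm-reservation" tag and the sidecar
# .meta.json convention are invented for illustration.
import json
from pathlib import Path

def may_train_on(media_path: Path) -> bool:
    """Return True only if metadata explicitly permits training on the file."""
    sidecar = media_path.parent / (media_path.name + ".meta.json")
    if not sidecar.exists():
        # No machine-readable permission: default to excluding the file.
        return False
    meta = json.loads(sidecar.read_text())
    # A value of 1 means the rights holder reserves text-and-data-mining rights.
    return meta.get("tdm-reservation") != 1

# Build a corpus only from files whose metadata does not opt them out.
corpus = [p for p in Path("catalogue").glob("*.mp3") if may_train_on(p)]
```

Making it "a regulatory offence" to ignore such a tag, as the article describes, would shift the check from a courtesy (as with today's `robots.txt`) to a legal compliance requirement.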

There is also a growing focus on supporting independent creatives, who may not have the backing of a major label to negotiate complex licensing deals with tech giants.

The government is considering the establishment of a collective licensing body, similar to those used for radio airplay, to handle AI training royalties.

This would ensure that even smaller artists receive a fair share of the revenue generated by the AI models that are built on their work.
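To illustrate the collective-licensing idea, many existing collection societies distribute a royalty pool pro rata to usage; the sketch below shows that arithmetic with invented figures. The pool size, rights-holder names, and usage counts are hypothetical, and a real body's weighting rules would be far more involved.

```python
# Hypothetical sketch of pro-rata distribution by a collective licensing
# body. All figures are invented; amounts are in pence to avoid float error.
def distribute(pool_pence: int, usage: dict[str, int]) -> dict[str, int]:
    """Split a royalty pool in proportion to each rights holder's usage."""
    total = sum(usage.values())
    shares = {name: pool_pence * n // total for name, n in usage.items()}
    # Integer division can leave a few pence over; assign the remainder
    # to the most-used catalogue so the pool is paid out exactly.
    remainder = pool_pence - sum(shares.values())
    shares[max(usage, key=usage.get)] += remainder
    return shares

# A £10,000 pool split across one major label and two independents:
print(distribute(1_000_000, {"Label A": 600, "Indie B": 300, "Indie C": 100}))
# → {'Label A': 600000, 'Indie B': 300000, 'Indie C': 100000}
```

The point of the design is that independent artists are paid from the same pool, by the same formula, as major labels, without negotiating individually with each AI developer.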

While the tech industry has expressed disappointment, claiming the UK may fall behind in the global AI race, the creative sector has hailed the move as a landmark moment.

The focus now shifts to the international stage, as the UK looks to coordinate its AI copyright policies with the European Union and the United States.

The era of the "Wild West" in AI training appears to be ending in Britain, replaced by a system that demands a "licence-first" approach for all commercial developers.

As the market for generative AI continues to evolve, the balance between technological progress and intellectual property rights remains one of the most significant legal battlegrounds of the decade.
