The United Kingdom government has formally abandoned plans to allow artificial intelligence companies to train their models on copyrighted music and creative works without the explicit permission of the original creators.
Technology Secretary Liz Kendall confirmed the reversal on 18 March 2026, stating that the administration no longer holds a preferred option for AI copyright reform that would favour tech developers over rightsholders.
The decision follows years of intense lobbying from the British creative industries and high-profile interventions from some of the world’s most successful musicians.
Under the previous proposals, the government had considered implementing a broad text and data mining (TDM) exception that would have permitted AI firms to scrape copyrighted material for training purposes.
This mechanism would have operated on an "opt-out" basis, meaning AI companies could use any copyrighted work unless the creator specifically acted to prevent it.
The scrapping of this policy is being hailed as a landmark victory for intellectual property rights in the age of generative technology.
It marks a definitive shift in the UK’s approach to balancing the growth of its technology sector with the protection of its world-leading creative economy.
The Policy Reversal and Its Origins
The origins of the dispute date back to a 2022 proposal by the Intellectual Property Office (IPO), which suggested a TDM exception to make the UK a more attractive hub for AI investment.
The logic at the time suggested that removing the financial and administrative burden of licensing millions of individual songs, books, and images would accelerate the development of domestic large language models and generative audio tools.
However, the creative sector argued that this move would essentially legalise the unauthorised use of human expression to build commercial products that could eventually replace human creators.
Liz Kendall’s announcement on 18 March 2026 clarified that the government has listened to these concerns and prioritised the integrity of the copyright framework.
The Technology Secretary noted that the government’s priority is now to ensure that the UK remains a place where "creativity is respected and rewarded."
The scrapped proposal would have allowed AI developers to use copyrighted creative works for training without seeking specific permission, provided they did not commercialise the end products without a secondary licence.
Rightsholders argued that the "training" phase is where the value of their work is most effectively harvested, making a training-level exception unacceptable.
Industry experts noted that the TDM exception would have put the UK at odds with other major jurisdictions, including the European Union, which has implemented stricter transparency and consent requirements through the EU AI Act.
By retreating from this position, the UK government is seen as aligning more closely with a "permission-first" model that treats creative data as a commodity with inherent value.
The Intellectual Property Office is now tasked with developing a new framework that balances technological innovation with the economic rights of photographers, authors, and musicians.
This process will involve a renewed focus on transparency, requiring AI companies to disclose the datasets used to train their algorithms.
The Creative Industry’s Unified Resistance
The scale of the opposition to the government's initial plans was unprecedented in the history of UK copyright law.
Over 11,500 individual consultation responses were submitted to the government, the vast majority of which rejected the proposed data mining exceptions.
The House of Lords also played a critical role, voting against the measures twice and highlighting the potential for "irreparable harm" to the UK’s cultural exports.
Musicians including Sir Elton John, Sir Paul McCartney, Dua Lipa, and Coldplay were among those who publicly criticised the government’s direction.
Björn Ulvaeus of ABBA and Max Richter were also vocal, with Richter testifying before parliamentary select committees regarding the ethical implications of AI training.
The core of their argument focused on the "human element" of creativity, suggesting that AI models are essentially "copying machines" that rely on the lifetimes of work produced by humans.
UK Music, the collective voice for the UK music industry, led much of the technical opposition, representing over 220,000 people working in the sector.
The organisation pointed to the fact that the UK music industry generates approximately £8 billion annually for the national economy.
Tom Kiehl, CEO of UK Music, described the government’s reversal as a significant relief for those whose livelihoods depend on copyright protection.
He stated that the sector is not against AI technology itself but insists that its development must happen within a legal framework that pays creators for their input.
The backlash was not limited to music; the publishing and visual arts sectors also joined the coalition, arguing that a TDM exception would lead to a "creative vacuum."
Photographers expressed concerns that their digital portfolios would be used to train image generators that could then produce "in the style of" work, directly competing with the original artists.
The unity across these diverse fields proved difficult for policymakers to ignore, especially given the creative industries' status as one of the fastest-growing sectors of the UK economy.
The government’s decision is seen as an acknowledgment that the economic value of the tech sector cannot be grown at the direct expense of the arts.
The Future of AI Licensing and Economic Stability
With the TDM exception now off the table, the focus has shifted to how a functional, voluntary licensing market can be established.
The government has announced the launch of a "Creative Content Exchange" pilot, scheduled to begin in the summer of 2026.
This pilot programme aims to provide a streamlined framework for AI companies to secure licences for copyrighted material in a way that is efficient for tech firms and profitable for creators.
It is expected to function as a clearinghouse, allowing for bulk licensing of creative catalogues while ensuring that royalties are distributed correctly.
Transparency remains a central pillar of the new strategy, with upcoming requirements for AI firms to label AI-generated content and provide clear audits of their training data.
This move is intended to prevent the "black box" nature of AI development, where creators are often unaware that their work has been used until a model produces a derivative output.
The UK’s choice to protect creators could influence international standards, as the US and other nations continue to grapple with "fair use" definitions in the context of machine learning.
Legal experts suggest that the UK is now positioning itself as a "high-standard" jurisdiction for intellectual property, which may attract creators and media companies looking for legal certainty.
However, some technology advocates warn that without easy access to data, UK-based AI firms may struggle to compete with companies in countries with more permissive data-scraping laws.
The government maintains that a "gold standard" of copyright protection will ultimately lead to higher-quality, more reliable AI systems that are built on ethically sourced data.
The £8 billion music sector, and the wider £125 billion creative industries, are now viewed as essential partners in the AI revolution rather than obstacles to it.
The "Creative Content Exchange" will be closely watched by global observers as a potential blueprint for how modern economies handle the intersection of human talent and automated processing.
For now, the message from Westminster is clear: the rights of the artist are not for sale in the pursuit of technological speed.
The "victory" for artists is a signal that the UK values the human origin of its cultural output as a unique and protected economic asset.
As the summer 2026 pilot approaches, the dialogue between Silicon Valley and the UK’s creative hubs is expected to move from legal confrontation to commercial negotiation.