AI and copyright – musicians’ struggles against AI’s onslaught strike a wrong note
- Summary:
- The world’s musicians are all privileged and rich, and AI levels the playing field, right? No, that could not be further from the truth. More revelations from last week's Westminster Media Forum.
With the UK’s creative communities reeling from revelations that UK Government ministers have, despite letters, pleas, and repeated lobbying, not even started a conversation about AI and copyright with leading artists’ organisations, more depressing detail emerged last week from a Westminster policy conference about just how badly artists have been hit by the technology.
Natasha Mangal is Legal and Policy Advisor, Creator Relations, at CISAC – the International Confederation of Societies of Authors and Composers. Founded a century ago, CISAC is the world’s leading network of authors’ societies, with over 220 member organisations in more than 110 countries, representing more than five million creators in music, audiovisual entertainment, drama, literature, and the visual arts.
In short, it defends the interests of the very creatives whose work has been extracted at industrial scale by AI vendors without consent, credit, or payment. Speaking at the Westminster Media Forum conference on Music Policy, Mangal said:
In 2024, CISAC released a global economic study on the impacts of AI on the creative industries. It found that, in the music sector alone, 24% of creators' revenues would be at risk by 2028 due to the substitutive effect of AI outputs on the marketplace. This represents a cumulative loss of over €10 billion [$11.7 billion] over the next few years.
That study is now two years old, and the situation has worsened dramatically since then, with the growing influx of AI-generated tracks and artists flooding popular streaming platforms. More on that in a moment.
However, Mangal proved to be a model of concision, focus, and bluntness in response to the financial and technological cataclysm. There are four urgent policy priorities, she said, given what she described as “these very high economic stakes”:
First, regardless of country or legal system, transparency is the key to unlock the licensing potential of generative AI models. AI model providers need to be held fully responsible for disclosing the sources and use of protected works, and rightsholders should have the ability to audit such disclosures.
This is an ongoing battle in the EU, she explained, where the recent Voss Report highlighted that the intentions behind the EU AI Act’s transparency requirements are not being met, and the template of disclosure does not go far enough. She continued:
Second, AI system developers should bear full responsibility for the choices they make regarding data sourcing, and they should not pass that responsibility on to end users. So, AI services should focus on providing users with services which use legal, fully licensed copyrighted content. The UK’s proposed Creative Content Exchange could play a role in this.
(At this point we must acknowledge at least one victory in legal actions against AI vendors for copyright theft. Last autumn, gen AI music vendor Udio was forced by the Universal Music Group to become a fully licensed platform, with similar actions ongoing against Suno and others. Plus, a US authors’ class action against Anthropic secured a $1.5 billion settlement last year over the scraping of pirated books.)
Mangal went on:
Third, licensing solutions, including collective licensing models, should enable lawful access to training data at scale while maintaining creators' incentives to create. Music societies can provide catalogues of high-quality, diverse, and culturally rich repertoires, which would ultimately improve an AI provider's ability to provide better services. And finally, the UK should definitively remove its prior preferred option for a commercial TDM exception, and instead make room for considering the development of licensing markets that strengthen the enforcement of rights, making sure that transparency obligations are not circumvented because of territorial restraints.
That’s good advice. On 18 March, the British Government did walk back its preferred option in the face of widespread protests and opposition, but it is not clear whether that option has been scrapped or is merely being tweaked for reintroduction in some modified form. As I noted in my previous report, this government still has time to shoot itself in the foot – and is highly adept at doing so. (Why doesn’t it just enforce UK copyright law?)
Further details on Downing Street’s plans are unlikely before 6 May, which is the deadline for the government’s formal response to the House of Lords’ Communications and Digital Committee’s report on AI and Copyright (see diginomica, passim). Like the Committee’s earlier Inquiry into Large Language Models, that report urged the government to support Britain’s creative sector, enforce copyright laws, and explore paid-for, opt-in licensing models.
Taking a stand
Countries can and should take a principled position, Mangal explained:
Australia has taken a firm stance in rejecting the introduction of a commercial TDM exception with opt-outs. And the EU is now in the process of conceding, as seen in the European Parliament's report, that the opt-out system, as it was envisioned, is not a workable system for rightsholders, particularly in controlling and regulating permissions of online digital content.
Indeed. And the Lords’ Committee made similar points in its report this year, which urged ministers to follow Australia’s example. Mangal added:
Further legislative intervention is now required to ensure that the promises of the EU AI Act are carried out as intended. Above all, the UK should learn from this experience and start to support and investigate how voluntary licensing and other licensing models can be promoted.
Music’s bleak midwinter
Next up was Deborah Annetts, CEO of the Independent Society of Musicians (ISM), Chair of the Creators’ Rights Alliance, and source of last week's jaw-dropping revelations of zero meetings. Not only have ministers from this 'listening' government failed to even start a conversation with creative-sector leaders, she said, but US AI vendors promised politicians they would kickstart Britain’s economic growth – but only if the Government handed them artists’ IP for nothing. In a staggering display of naivete, it seems that ministers actually believed them.
That some in government live in a world of vendor make-believe is all too evident, Annetts said:
I’ve found myself talking to civil servants who have told me how much better the jobs of musicians are going to be in the future, thanks to AI – they were all going to get better jobs! To which I replied, ‘But what is a better job if you're a musician? You just want to be a musician!’
As CEO of the ISM – founded in 1882 by Sir Edward Elgar – Annetts has a grassroots perspective on this, courtesy of the Society’s 11,000 professional members across every genre. It is vital to understand that the vast majority of musicians are already struggling, she explained, regardless of AI’s new impacts on their careers. For every Paul McCartney or Taylor Swift there are tens of thousands of talents who are barely surviving in the financial margins:
Ninety-three percent of our members are freelance. That means it's already difficult for them to get mortgages, it’s difficult for them to manage debt, and it’s incredibly difficult for them to manage the new provisions coming through HMRC in relation to tax. We're hearing lots of terrible stories happening there.
Annetts is referring to the UK’s compulsory Making Tax Digital (MTD) scheme, which imposes new obligations on self-employed taxpayers to use paid-for cloud services and maintain constant digital records – work that would normally be done by an accountant. This increases freelancers’ financial costs and their administrative burdens, while giving them nothing in return. But what has this got to do with AI, beyond being yet another technology that has been foisted on the electorate without consent? Annetts explained:
The point is the freelance musician lifestyle is already one of not knowing if you're going to get another job, not knowing if you can pay your costs, and not knowing if you can keep a roof over your head. Plus, promoters are notorious for walking away without paying performers, but most musicians are too scared to ask for their money [in case they don’t get another gig]. It's grim, and added to that, we've had COVID [which shut every venue in the world], Brexit, and now AI. I sometimes wonder why musicians keep doing what they do. And I can only think it's because they love doing it.
For British musicians, Brexit alone has been a catastrophe, said Annetts, before we even get to AI:
Our research shows just how dire Brexit has been for musicians, reducing their income by about 50%. But every time I have a conversation with civil servants, they'll say, ‘Yes, it's on our to-do list, but it's terribly difficult and the EU wants something back.’ Well, musicians need them to do that work, given the impact of AI!
So, what is that impact? In January, Annetts’ organisation published a report on generative AI called Brave New World. Its findings are bleak, she said:
Seventy-three percent of musicians say un-regulated AI threatens their ability to earn a living. Fifty-three percent say they have already lost work to gen AI, and 17% of respondents report being forced to undertake AI-related work...Musicians are being pressurised into doing AI stuff [training AI systems]. Many describe the loss of session and songwriting work as studios have replaced musicians and composers with AI-generated alternatives. And I have been in board meetings with session musicians who told us how they've been forced to record sounds to train AIs, which they know will replace them on tracks in the future.
Our investigations have also revealed lost commissions worth £10,000 or more. And only seven percent of musicians have ever been approached to license their work for gen AI training, and fewer than one in five of those have ever received any payment. In total, that's about one percent of musicians [who are paid by AI vendors]. Meanwhile, 65% of performers and 93% of voice artists see AI as a threat too. And on that point, there is a massive issue around personality rights. Like us, [actors union] Equity is seeing personality rights being impacted negatively for creators.
This is the phenomenon of AI generating compositions in the unique style, sound, voice, and musicality of a human artist, but without consent or payment. This exploits their talent, pollutes their repertoire, and diverts income to fake acts. Actors face similar threats to their livelihoods, with AI able to exploit their likenesses, personalities, and performances, again without consent or payment. Several on-demand video companies now exist which sell ‘synthespian’ performances trained on the work of human actors – and virtual fashion models exist too.
Legislation in this area cannot come soon enough, said Annetts, who explained:
Whatever is going on with tech firms in terms of licensing at the top level is certainly not finding its way down to ISM members. One of the questions I keep asking, in relation to the value chain, is, ‘Is the ISM member ever going to get paid in relation to their work, which has been scraped by tech companies without permission?’ So far, given the structure of our legislation and the way collective management organisations work, the answer is no. So, even though my members' work has been stolen, they are never going to see any payment. And that is frankly just wrong.
She added:
We know that US tech companies love to come in, break things, and walk away without paying for anything, but we urgently need to protect our creative industries, which are worth £125 billion [$169 billion] per annum to the UK and employ 2.4 million workers. By comparison, the UK’s AI companies are worth just £11.8 billion and employ under 100,000 people.
And as we have seen, even Britain’s AI start-ups disagree with the government’s approach to AI and copyright. Trade organisation UKAI’s own report last year described Downing Street’s proposal to opt creators’ work into AI training by default as “misguided”, “damaging”, and “divisive”. Yet the UK Government simply ignored it.
My take
Full marks to Mangal, who said more in five minutes than some keynotes do in forty: her presentation was 100% signal and zero noise. By contrast, the UK Government’s own report on AI and copyright, released last month in the long tail of its 2025 public consultation, took 125 pages to say almost nothing, except that the Government is listening and assessing the situation. (Is it?)
As to Annetts’ point about musicians carrying on through love of what they do against the AI odds, as a part-time professional musician myself, I can confirm that this is so. And I would add that musicians, songwriters, composers, and self-employed recording artists also now exist in a market where technologists have taught listeners that music is free, and 90% of songs receive no streaming royalties from Spotify. The remaining ten percent receive a maximum of one dollar (frequently a lot less) for every thousand streams – and that is only if they own the intellectual property rights in the song. I’ve chatted several times with music legend Gary Numan, who told me recently that he received roughly $38 in one year for a million streams of a track – bear in mind, recording contracts only pay artists a small percentage of profits once they have earned back any advance. My own band has received less than $80 for tens of thousands of streams on major platforms – and that is revenue, not profit. Such a world does not come close to covering creators’ costs.
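Those royalty figures are easier to grasp as per-stream rates. Here is a back-of-the-envelope sketch in Python, purely illustrative and based only on the numbers quoted above; the `per_stream_rate` helper is my own, not any streaming platform's API:

```python
# Back-of-the-envelope per-stream royalty rates, using figures quoted above.
# Purely illustrative; per_stream_rate is a hypothetical helper, not a real API.

def per_stream_rate(payout_usd: float, streams: int) -> float:
    """Payout divided by stream count: dollars earned per single play."""
    return payout_usd / streams

# Gary Numan's reported figure: roughly $38 for a million streams in a year.
numan = per_stream_rate(38.0, 1_000_000)

# The ceiling mentioned above: at most $1 per thousand streams.
ceiling = per_stream_rate(1.0, 1_000)

print(f"Numan's implied rate: ${numan:.6f} per stream")    # $0.000038
print(f"Quoted ceiling:       ${ceiling:.6f} per stream")  # $0.001000

# Even a modest $1,000 recording budget would need tens of millions
# of plays to recoup at the lower rate:
print(f"Streams to earn $1,000 at Numan's rate: {1_000 / numan:,.0f}")
```

In other words, at the reported rate an artist would need well over 26 million streams to gross $1,000 – before any label recoupment, distribution fees, or recording costs are deducted.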
And along comes AI...
According to Annetts, the problem is that politicians have a partial, US vendor-influenced view of the impact of AI. So, the fact that Britain’s ministers have yet even to start a conversation with creative communities, despite leaders like Annetts lobbying them repeatedly for a meeting, remains mind-blowing and bears repeating.
Meanwhile, at the same Westminster Forum conference, Professor Mykaell Riley, Director of the Black Music Research Unit at the University of Westminster, painted a horrifying picture of how the world’s Black artists are bearing the financial brunt of AI while dealing with wholesale cultural appropriation. We'll deal with that topic separately.