Main content

"Cultural extraction with a good vocabulary" - why Black music and culture are AI's most exploited victims


Chris Middleton, April 23, 2026
Summary:
Black music and culture are AI’s most exploited victims


Bob Marley
(Pixabay)

Last week's Westminster Media Forum conference on music policy in London - see diginomica passim - provided a bumper crop of depressing findings:

• a vacuum at the heart of the British government on this policy issue
• the dire straits in which most of the world’s musicians now find themselves 
• the enforced training of AI systems by professional artists, who are being forced to collaborate in their own replacement
• the government’s misplaced faith in US AI companies’ ability to fix Britain’s economy in return for being handed artists’ IP for nothing 
• and the fact that a government that claims to be listening still won’t let artist rights campaigners talk to its Ministers.

That’s quite a list already - and there's more to come. 

Another speaker at the event explained that one community is being disproportionately affected by the industrialized IP extraction of US AI vendors, and their cavalier attitudes to the cultural, societal, and economic impacts.

In an impassioned, detailed, and evidenced presentation, Professor Mykaell Riley, Director of the Black Music Research Unit at the University of Westminster, set out how Black artists are bearing the brunt of the worldwide exploitation – and appropriation – of artists’ work.

According to Professor Riley, over 80% of the UK’s £30 billion ($40.5 billion) recorded music output over the past 30 years has originated from Black music genres. So, as AI-generated music proliferates, Black artists inevitably suffer the deepest impact and losses.

As he explained, over 50,000 AI-generated tracks are uploaded to Spotify every day, while Google’s Lyria 3 Pro system, launched last month, enables users to generate a three-minute song for just eight pence (10 cents) – none of which goes to the artists whose work was scraped to train the system.

Riley explained that his research looks at AI, data mining, and their combined effects on diversity and the talent pipeline. He said:

Last month, UK Music launched the Music Means Business report at Westminster, the first report of its kind in Europe. The headline was music originating from Black genres accounts for over 80% of the UK's £30 billion recorded music market. My own contribution to that report was the genre framework which I worked on with [curator, researcher, and public historian] Dr Aleema Gray as part of the Beyond the Bassline exhibition at the British Library, which looked at over 500 years of contributions by Black British musicians to British music.

So, here we have the two realities between which we are all sitting. On one side is the £24.5 billion [$33 billion] contribution from Black music [to the UK’s total output]. And on the other, a machine that can produce music for just eight pence, with no idea of where those sounds came from.

But surely our supposedly superintelligent AIs can understand cultural context? After all, don’t they tell us about it? That is to misunderstand metadata and how machine intelligences work, he explained:

The Lyria 3 Pro model can generate liner notes that, for example, could name Soul II Soul, or describe Brit Funk as a London invention, or map the whole genealogy of UK R&B – I know this, because I've tested it. But there's a critical distinction that the room needs to understand. And that is the difference between narrative attribution and structural reciprocity. The AI can tell the story, but it cannot pay the creators. That eight pence per song simply covers Google's compute costs. It does not include a single penny flowing back to the communities whose music and traditions trained the model.

And the technology that's supposed to provide transparency, SynthID, the [DeepMind] watermark that Google embeds in every AI-generated track, simply tells us that a file was made by AI. What it doesn't tell you is which sonic DNA it drew from. So, you have a description layer that can narrate the history, and a watermark layer that can flag the output, but no value-flow layer connecting the two. Cultural knowledge was extracted during the AI’s training, but cultural attribution was not carried through to the output.

That's not cultural awareness, that's cultural extraction with a good vocabulary.

Unfit for purpose

Powerful words. The point is that metadata standards were designed for a world of single authors and Western scales, added Riley:

They were never built for music like Grime, which can be traced back through Jungle and Dance Hall, through sound-system culture, all the way back to Kingston, Jamaica. None of that genealogy exists in the metadata field at the moment – and certainly not in a way that pays creators.

So, when an AI trains on those patterns and generates something new, the cultural DNA simply vanishes: no attribution, no royalty. You can build a Creative Content Exchange, and you can improve streaming attribution, but if the metadata itself doesn't carry any cultural provenance, then you are protecting a structure while the foundations remain invisible.

Of course, a listener might then use AI to explore the deep roots of a genre themselves, in which case we can only hope that the information they uncover is both accurate and ethically sourced, with attribution and payment to its researchers. (It almost certainly isn’t, of course, as multiple lawsuits attest.)

But none of that helps artists if there is no mechanism to pay them for their work – it is the equivalent of placing all living culture in a museum and replacing it with one generated by machines in a cultural vacuum. (No wonder AI is called the Fourth Industrial Revolution.)

The deeper issue is that these problems feed back into education, and thus have direct impacts on “diversity in talent pipelines and on the long tail of developing change in this sector”, said Riley, adding:

I'm saying this as a practitioner. I'm going to begin with one of the very first exhibitions that I worked on, called Bass Culture [2018, which explored the influence of Jamaican and Jamaican-inspired music on British culture]. Its importance was that this research, which took five years to develop, removed Form 696 from London policing.

Form 696 was a Metropolitan Police risk-assessment form that required event promoters and licensees to disclose what type of music would be played at live events, including the names, addresses, and ethnicity of artists. Widely seen as a racist means of suppressing specific genres, particularly Grime, the policy was eventually scrapped in the face of an outcry by Black and other artists, including Irish musician Feargal Sharkey. Riley continued:

That led to a second major exhibition, Beyond the Bassline, which took place at the British Library, the narrative of which was Black British music contributions all the way back to the Tudor period of Henry VIII. And right now, I am working on an AHRC [Arts and Humanities Research Council] funded project titled Equalize, which is looking at the absence of Black music in schools. That’s important, because if music of Black origin generates 80 percent of the UK market, but that music isn't being taught in schools, then you have an economic pipeline with no investment at the beginning of said pipeline.

Which is just like the AI sector today, of course: a revenue pipeline for vastly wealthy tech corporations, disconnected from both artists and their cultural heritage, except in the most superficial of terms. Riley concluded with a simple proposition:

The UK’s Data Use and Access Act and the Creative Content Exchange [proposed by the government in January, as part of the Creative Industries sector plan] need to address not just who owns the music, but where that music comes from. We need cultural provenance in metadata. Not just a description layer that tells the story after the fact, but a value-flow layer that ensures that the communities who created these traditions – in this case, black British music traditions – are structurally connected to the economic systems that are built on top of them.

My take 

A superb presentation that gets to the heart of issues that are rarely considered in the acrimonious debate about artists’ IP, consent, and payment: namely, wholesale cultural appropriation, as well as economic exploitation.

Image credit - Pixabay
