by Sofia Villegas
02 July 2025
Creative Crossroads: Will AI kill the arts?

This image was created using Adobe Express AI image generation software. It took about six seconds. The prompt we gave it was: “Create an illustration about the incompatibility of artificial intelligence and copyright”.


I arrive at the Scottish Parliament with a few minutes to spare for a session on the future of the creative industries. As I take my seat, Francis Bondd slides in beside me, flashing a wide grin, and says, “I’m an engineer during the week, rockstar by the weekend.”

He is dressed in a black suit, with a string of white pearls and a single silver hoop, clearly channelling his inner star. Originally from Nigeria, the self-taught guitarist moved to Scotland two years ago. “As a migrant, when you walk into the room, the first question is always, ‘what’s he going to sound like?’” he tells me. “I have to prove myself 10 times more than anyone else.”

With just shy of 9,000 followers on Instagram, Bondd has high hopes of breaking into the music scene, and a potential gig at the Fringe is on the cards. As we wait for the session to start, his expression shifts. “I’d feel terrible,” he says quietly, “if AI takes all that away from me.”

This fear of impending theft is one shared by thousands of artists across the UK. Artificial intelligence has advanced to the point that tools like ChatGPT and DALL·E 2 can now generate entire stories or detailed images in seconds. The increasingly blurry line between technology and art has triggered one of the biggest political standoffs of the year.

Since May, Baroness Beeban Kidron has led the charge in the Lords, pushing to amend the Data (Use and Access) Bill so that tech companies would be required to disclose their use of copyrighted material when training AI models. She argued failing to do so would allow for “widespread theft,” a stance backed by high-profile figures such as Elton John, Dua Lipa, and Paul McCartney.

But the Commons repeatedly rejected the measures, sending the bill into a game of legislative ping-pong. Neither chamber seemed willing to back down until the government finally blocked the amendment, citing financial privilege – a parliamentary rule that allows the Commons to overrule a Lords proposal with public spending implications.

The move has since triggered widespread concern about what it signals for the long-term survival of the creative industries, which contribute £124bn annually to the UK economy.

“I feel gutted... Transparency is the most basic right we [artists] deserve in order to enforce copyright effectively,” says Clementine Collette, a fellow at UK Research and Innovation’s Bridging Responsible AI Divides (Braid) programme. Her disappointment is echoed right across the sector. 

While discussing the future of Scottish gaming, Jade Law, chief executive of Wardog Studios, tells Holyrood: “I found my own work in some of those [AI] datasets... I’m so upset... It’s all there and there’s nothing we can do to stop it.”

The growing concern over the intersection of AI and the creative industries has also made its way to Holyrood. Earlier this month, Conservative MSP Sandesh Gulhane put First Minister John Swinney on the spot after ScotRail – which is owned by the Scottish Government – allegedly used Scottish artist Gayanne Potter’s voice to create its AI train announcer ‘Iona’ without her consent.

This image was created at the same time as the image on the previous page. The software gave us a few options; we used our creative judgement to choose which one to make bigger.

Swinney confirmed the train company was “fixing” the issue. ScotRail told Holyrood it is in talks with ReadSpeaker – the tech company that developed the announcer – and “will provide a further update in due course”.

Adding to the controversy, ScotRail’s Iona system is not listed in the Scottish AI Register – a public database intended to provide transparency around the use of AI in projects developed by public bodies. The register, launched over a year ago, was meant to start with mandatory entries from government departments as part of a phased rollout across the wider public sector. A Scottish Government spokesperson told the magazine “the intention is to extend [the register] to all public sector bodies in the next few months”, including ScotRail.

Despite the seemingly polarised debate, most artists are not entirely against AI itself. A report by the Design and Artists Copyright Society found that while nearly three-quarters of UK artists are concerned about their work being used to train AI models without permission, 84 per cent said they would be open to sharing their work if it were under a proper licensing agreement.

The problem, perhaps unsurprisingly, is that tech firms believe copyright and AI are incompatible. In its submission to the Lords’ Communications and Digital Committee inquiry on large language models, OpenAI, the company behind ChatGPT, said that “because copyright today covers virtually every sort of human expression... it would be impossible to train today’s leading AI models without using copyrighted materials”.

And the UK Government seems to agree. In January, Prime Minister Keir Starmer called AI the “defining opportunity of our generation” while unveiling his AI Opportunities Action Plan. Among its 50 commitments was a pledge to reform the UK’s text and data mining regime, on the grounds that “uncertainty around intellectual property” was a barrier to innovation and a threat to the growth of the creative industries. For those in the arts, that sounded less like an opportunity and more like a warning.

The plan was launched just weeks before the government’s AI and copyright consultation closed in February – a timeline that left many feeling the decision had been made before their voices could be heard.

Law says: “Starmer is about to throw creatives under the bus in order to give big tech companies what they want, but that completely conflicts with the UK having a gold standard for copyright law.”

Collette, along with her fellow Braid researchers, responded to the government’s consultation, urging ministers to require AI developers to adopt an opt-in model. Under this approach, individuals would need to actively give permission for their data to be collected or used, in contrast to the opt-out system operated by many generative AI models, which assumes users agree to data collection by default.


She claims the opt-out model puts the onus on creatives to protect their work and could “undermine” and “compromise an author’s legitimate rights to control their own IP”. Like Law, she argues that “the government has massively prioritised growth and this kind of pipe dream of technological innovation”.

Caterina Moruzzi, another Braid research fellow, argues for a more “granular” approach, whereby creators and audiences would have “more agency in the process”.

She tells Holyrood: “Without that fine-grained control, it would be very difficult to say who is responsible for what aspect of the creation. So yes, achieving this more fine-grained detail in not just how we create but also how we engage with content online – understanding what part was created by whom, whether it was modified – the whole concept behind these standards for content provenance and authenticity is the only way forward.

“We’ve already crossed that line where we can’t go back. But this direction of more granular control is at least how we can manage to avoid some of the biggest challenges.”

Earlier this year, an image made with generative AI platform Invoke, which uses this layered approach, became the first AI-generated artwork to be granted copyright protection in the US, on the basis that it involved “a sufficient amount of human original authorship”. Rather than relying on a simple text prompt, the team selected, coordinated, and arranged numerous AI-generated image fragments into one piece, and the US Copyright Office agreed that the process mirrored that of a collage artist.

It is this new type of tech-enabled art that “excites” Moruzzi. She explains: “The opportunity that they [AI tools] give to create new forms of creativity, not replacing what we already know and value as art, but maybe expanding the horizon of what can count as creativity and as art.”

However, she recognises that if AI’s influence in the creative industries continues to grow unchecked, the consequences could reach far beyond the sector itself. Most of the leading generative AI models are developed and trained in the west – primarily in the US and UK – meaning there is a growing risk that the technology could narrow society’s understanding of culture and beauty.

Moruzzi explains: “It’s unavoidable just for the fact that these models can only know what they have been trained on, and they have mostly been trained on western concepts... The development of this technology comes from the west, and tech companies are mostly based in the western world.

“But also, because there is a whole tradition of global art and culture that is not necessarily in a digital form, but is more community based and more embodied. It’s things like oral traditions – traditions that are not necessarily reproducible and cannot be used to train these models. And of course, this leads to the concern of the homogenisation of content and reproducing only a specific kind and tradition of content.”

She compares the rise of generative AI to social media, warning that consumers could end up trapped in “echo chambers” of what is deemed as ‘good’ art.


Collette adds that AI’s inability to distinguish between true and false information could also distort the value of art itself, especially when works are created using misleading or inaccurate prompts.

“It would be a terrible thing for our society if we would have models that perpetuated those stereotypes. And these models are very capable of doing things like that,” she says. 

Moreover, high-profile campaigners such as Ed Newton-Rex, founder of Fairly Trained, have warned that the web crawlers AI models rely on for content could soon have a profound impact on research, accelerating the rise of paywalls and other barriers that limit access to online sources. From scientific researchers to journalists, those who depend on open access to information may find themselves increasingly shut out.

However, if the prevailing narrative of AI versus the artist is shifted, the technology can also hold significant potential for the creative industries. Back in parliament, Bondd acknowledges that, “in the long run,” using AI could work to his advantage. “If I become a big name, there’s a channel for the benefits to come back to me,” he says, suggesting that with the right protections in place, creators could stand to gain rather than lose.

Moruzzi adds: “It can really enable a whole community of people having access to tools for creation that they wouldn’t have otherwise, and even to expand their audiences by appealing to different kinds of cultures and traditions. So it can definitely have good benefits.”

The evidence is extensive. In 2021, the University of Glasgow launched Our Heritage, Our Stories, a virtual national collection that uses AI to connect UK-wide community-generated historical content. Elsewhere, trials combining AI and eye-tracking tech are helping people with limited mobility create art. 

But until a middle ground is reached, the long-term survival of the industry hangs in the balance. While Moruzzi insists “humans will always seek human-created art”, she acknowledges it will become harder to recruit a new generation of artists.

She explains: “Those kinds of tasks that maybe creators do as soon as they graduate are already being taken over by this technology. And usually, the narrative is that AI will relieve us from the burdensome tasks that creators have to do. But these are the tasks that give a living to early career creators.”

For now, all eyes are on the AI Bill, which aims to close the regulatory gaps around the technology. Currently going through its second reading in the House of Lords, the legislation could reignite political tensions between the chambers.

But BD Owens, president of the Scottish Artists Union, worries it may be too late. “Even if there is future legislation aimed at AI and copyright, by then the damage will have been done to the livelihoods of countless artists and cultural workers in Scotland and across the UK.” 
