Scotland’s fortnightly political & current affairs magazine

Subscribe

Subscribe to Holyrood
by Ethan Claridge
13 November 2025
UK Government introduces law to tackle AI child abuse images

Reports of AI-generated child sexual abuse material more than doubled in the past year | Alamy

The UK Government has introduced new legislation that will allow it to work with the artificial intelligence (AI) industry and child protection organisations to ensure AI models cannot be misused to create synthetic child sexual abuse content.

The legislation comes after reports of AI-generated child sexual abuse material more than doubled in the past year, rising from 199 in 2024 to 426 in 2025, according to the Internet Watch Foundation (IWF). There has also been a disturbing rise in depictions of infants, with images of 0–2-year-olds surging from five in 2024 to 92 in 2025.  

“We must make sure children are kept safe online and that our laws keep up with the latest threats,” said Jess Phillips, minister for safeguarding and violence against women and girls. “This new measure will mean legitimate AI tools cannot be manipulated into creating vile material and more children will be protected from predators as a result.”

Under the new legislation, designated bodies, including AI developers and child protection organisations such as the IWF, will be allowed to scrutinise AI models for illegal material. This is to ensure safeguards are in place to prevent the models generating or proliferating child sexual abuse material, including indecent images and videos of children.

Currently, developers can be held criminally liable for creating and possessing this kind of material, even when it is produced solely to safety-test AI models. As a result, images can only be removed after they have been created and shared online. The new measure will enable designated bodies to test an AI system's safeguards from the outset, ensuring these systems are incapable of producing child sexual abuse material.

Technology Secretary Liz Kendall said: “These new laws will ensure AI systems can be made safe at the source, preventing vulnerabilities that could put children at risk. By empowering trusted organisations to scrutinise their AI models, we are ensuring child safety is designed into AI systems, not bolted on as an afterthought.”

To ensure that the testing work is carried out safely and securely, the government intends to bring together a group of experts in AI and child safety for oversight. The group will ensure that sensitive data is protected, prevent any risk of illegal child sexual abuse content being leaked and support the wellbeing of researchers involved, who could be emotionally affected by the testing process.  

Data from the IWF also shows that the severity of the material identified online has intensified over the past year. The number of reports of Category A content, defined as images involving penetrative sexual activity, sexual activity with an animal, or sadism, rose from 2,621 to 3,086 items. These images now account for 56 per cent of all illegal material reported, up from 41 per cent last year. Girls have been overwhelmingly targeted by this content, appearing in 94 per cent of illegal AI images in 2025.

“AI tools have made it so survivors can be victimised all over again with just a few clicks, giving criminals the ability to make potentially limitless amounts of sophisticated, photorealistic child sexual abuse material,” said Kerry Smith, chief executive of the IWF. “For three decades, we have been at the forefront of preventing the spread of this imagery online – we look forward to using our expertise to help further the fight against this new threat.”
