The UK government will allow tech firms and child safety charities to proactively test artificial intelligence tools to make sure they cannot create child sexual abuse imagery.
An amendment to the Crime and Policing Bill announced on Wednesday would enable "authorised testers" to assess models for their ability to generate illegal child sexual abuse material (CSAM) prior to their release.
Technology Secretary Liz Kendall said the measures would "ensure AI systems can be made safe at the source" - though some campaigners argue more still needs to be done.
It comes as the Internet Watch Foundation (IWF) said the number of AI-related CSAM reports had doubled over the past year.
The charity, one of only a few in the world licensed to actively search for child abuse content online, said it had removed 426 pieces of reported material between January and October 2025.
This was up from 199 over the same period in 2024, it said.
Its chief executive, Kerry Smith, welcomed the government's proposals, saying they would build on the charity's longstanding efforts to combat online CSAM.
"AI tools have made it so survivors can be victimised all over again with just a few clicks, giving criminals the ability to make potentially limitless amounts of sophisticated, photorealistic child sexual abuse material," she said.
"Today's announcement could be a vital step to make sure AI products are safe before they are released."
Rani Govender, policy manager for child safety online at the NSPCC, welcomed the measures, saying they would encourage firms to take greater accountability for, and apply greater scrutiny to, the child safety of their models.
"But to make a real difference for children, this cannot be optional," she said.
"Government must ensure that there is a mandatory duty for AI developers to use this provision so that safeguarding against child sexual abuse is an essential part of product design."
'Ensuring child safety'
The government said its proposed changes to the law would also equip AI developers and charities to make sure AI models have adequate safeguards around extreme pornography and non-consensual intimate images.
Child safety experts and organisations have frequently warned AI tools developed, in part, using huge volumes of wide-ranging online content are being used to create highly realistic abuse imagery of children or non-consenting adults.
Some, including the IWF and the child safety charity Thorn, have said these images risk jeopardising efforts to police such material by making it difficult to identify whether content is real or AI-generated.
Researchers have suggested there is growing demand for these images online, particularly on the dark web, and that some are being created by children.
Earlier this year, the Home Office said the UK would be the first country in the world to make it illegal to possess, create or distribute AI tools designed to create CSAM, with a punishment of up to five years in prison.
Ms Kendall said on Wednesday that "by empowering trusted organisations to scrutinise their AI models, we are ensuring child safety is designed into AI systems, not bolted on as an afterthought".
"We will not allow technological advancement to outpace our ability to keep children safe," she said.
Safeguarding Minister Jess Phillips said the measures would also "mean legitimate AI tools cannot be manipulated into creating vile material and more children will be protected from predators as a result".