
The UK government is working on four new laws to target people who use AI tools to generate child sexual abuse material (CSAM). According to the Home Office, the laws will make it illegal to create, possess, or distribute AI tools designed to generate CSAM, making the UK the first country to do so.
Those found guilty may face up to five years in prison. The new laws will also criminalize possession of AI "paedophile manuals" that teach people how to use AI to sexually abuse young people, punishable by up to three years in prison. It will also be unlawful to run websites that host CSAM or offer advice on how to groom children, carrying a sentence of up to 10 years.
Furthermore, the country's Border Force will get powers to inspect the digital devices of suspected child abusers for CSAM when they try to enter the UK. This could carry a prison sentence of up to three years, depending on the severity of the content found.
"We know that sick predators' activities online often lead to them carrying out the most horrific abuse in person. This government will not hesitate to act to ensure the safety of children online by ensuring our laws keep pace with the latest threats," Home Secretary Yvette Cooper said.
AI-generated CSAM can include content that is wholly or partly computer-generated. It can also include real images that have been edited, for example by swapping in another person's face, or content that uses the real voices of children.
Data from the National Crime Agency (NCA) shows it makes around 800 arrests each month related to online threats to children. According to the agency, as reported by the BBC, about 840,000 adults, or 1.6% of the adult population, pose a threat to children in the UK, both online and offline.
Organizations working against child abuse suggest the government can take further action. The Internet Watch Foundation (IWF) is calling for a ban on "nudifying" apps, which are used to create nude images of both adults and children.
“The frightening speed with which AI imagery has become indistinguishable from photographic abuse has shown the need for legislation to keep pace with new technologies," said the IWF's interim chief executive, Derek Ray-Hill.
"The availability of this AI content further fuels sexual violence against children. It emboldens and encourages abusers, and it makes real children less safe."
While creating or possessing AI-generated CSAM is already illegal, the IWF notes that the new offence outlaws AI models optimized to "create the most severe forms of child sexual abuse material."
The charity was among the first to sound the alarm about AI and synthetic CSAM. It has seen an increase in the amount of AI-generated CSAM found online, noting that many of the 245 reports of such imagery in 2024 were "so realistic it had to be treated exactly the same as ‘real’."
These new laws will be included in the upcoming Crime and Policing Bill and are part of the UK government's wider effort to improve child safety online. Ofcom, the UK's communications regulator, recently announced that adult websites must implement age verification checks before granting access to users. The regulator has also asked tech firms to complete, by March 16, an assessment of the risks that illegal content on their platforms poses to children and adults.
Source: BBC