Artificial intelligence (AI) has put child sexual abuse “on steroids”, the Home Secretary has warned ahead of a crackdown on computer-generated material.
Yvette Cooper said a new ban on AI tools which can create child sexual abuse material – including “sexualised” pictures of real children – would give law enforcers the power to “keep more children safe”.
Owning an AI tool which can generate these pictures could land offenders with a five-year prison sentence.
Users found to own AI-made “paedophile manuals” could also face up to three years in prison under measures the Government has proposed as part of the Crime and Policing Bill.
Ms Cooper told Sky News’s Sunday Morning With Trevor Phillips programme: “This is a real, disturbing phenomenon that we’ve got where we’ve known for some time the online child sexual abuse material is growing, but also the grooming of children, teenagers online.
“What’s now happening is that AI is putting this on steroids and it is making it easier for perpetrators, for abusers, to groom children, and it’s also meaning that they are manipulating images of children and then using them to draw and to blackmail young people into further abuse.
“It’s just the most vile of crimes.
“So, what we need to do is to strengthen the law, and that includes banning some of the AI models being used for child abuse, but also banning some of the paedophile manuals.”
The Home Secretary added that the National Crime Agency (NCA), which investigates cybercrime and cross-border economic offences, “are saying these further powers are needed and they will then be able to use them to get prosecutions to keep more children safe”.
Fake images are being used to blackmail children and force them to livestream further abuse, according to the Home Office, and ministers fear online abuse can lead viewers to offend in person.
Ms Cooper said: “Very often they’re using images of real children and then abusing them, manipulating them and making them sexualised.
“These are being circulated then in these huge forums and what the NCA will say is this is drawing more perpetrators into more extreme and more sadistic abuse.”
She later added: “I’m really worried that people think, ‘oh, well, it’s AI, we shouldn’t take it seriously’.
“Actually, the evidence is that what it’s doing is it’s actually escalating, accelerating, the abuse, and that’s why the laws that we’re bringing in – this is world-leading, other countries are not yet doing this, but I hope everyone else will follow.”
The Bill will also introduce a specific offence for paedophiles who run websites to share child sex abuse which could carry a 10-year prison sentence.
The law reforms come after warnings from the Internet Watch Foundation (IWF) that more and more sexual abuse images of children are being created.
The charity’s latest data shows reports of AI-generated child sexual abuse images have risen by 380%, with 245 confirmed reports in 2024, compared with 51 in 2023.
Each of these reports can contain thousands of images.
Some of the AI-generated content is so realistic that it can be difficult to tell the difference between real abuse and fake imagery, the charity said.
Derek Ray-Hill, IWF interim chief executive, said the steps “will have a concrete impact on online safety”.
He added: “The frightening speed with which AI imagery has become indistinguishable from photographic abuse has shown the need for legislation to keep pace with new technologies.
“Children who have suffered sexual abuse in the past are now being made victims all over again, with images of their abuse being commodified to train AI models.
“It is a nightmare scenario and any child can now be made a victim, with life-like images of them being sexually abused obtainable with only a few prompts and a few clicks.”
Lynn Perry, chief executive of Barnardo’s children’s charity, said: “We welcome the Government taking action to tackle the increase in AI-produced child sexual abuse imagery which normalises the abuse of children, putting more of them at risk, both on and offline.
“It is vital that legislation keeps up with technological advances to prevent these horrific crimes.
“Tech companies must make sure their platforms are safe for children. They need to take action to introduce stronger safeguards and Ofcom must ensure that the Online Safety Act is implemented effectively and robustly.”