
- New data released today by the child protection charity Lucy Faithfull Foundation reveals that, over the past two years, warning messages on online platforms have been triggered more than 70 million times by users seeking sexual images of children online – more than 95,000 per day.1
- Through Project Intercept, an innovative collaboration between the charity and major technology platforms such as Mega, Google, and TikTok, almost 700,000 people2 who were engaging in harmful or risky online behaviours sought support from the charity’s Stop It Now service to change their behaviour. Warning messages are active on a range of online platforms, including end-to-end encrypted and AI chatbot spaces.
- The charity urges more tech companies to integrate behaviour-change solutions into their platform design: putting friction in the pathways of potential offenders, making consequences clear, and offering a clear route to help through the charity’s Stop It Now services.
Data released today by the UK-wide child protection charity Lucy Faithfull Foundation reveals the impact of effective partnerships between technology companies and specialist NGOs, as warning messages intercept millions of people seeking illegal sexual images of children online.
The collaborations have resulted in 70 million warning messages being served to people who used search terms indicating they were looking for sexual images of children, or who clicked links previously reported as containing such images.
OpenAI, Meta, TikTok, Mega, Stability AI, Civitai, BT, MovieStarPlanet, Bing, Google, Aylo, and OnlyFans are among the tech companies who have deployed warning messages, operating successfully in a range of spaces – including in end-to-end encrypted environments, a traditionally difficult area to enforce child safety.
Warning messages are active on platforms across 131 countries and in a range of sectors, including file-sharing, search, social media, dating and pornography platforms. After being served a warning, almost 700,000 people have clicked through to the support the charity offers to change their behaviour, through its Stop It Now service.3
Warnings emphasise the illegality of viewing sexual images of under 18s, offer an option to report suspected illegal images, and crucially, signpost to support offered through the charity’s Stop It Now service. Stop It Now offers support to anyone with concerns about child sexual abuse, including for those concerned about their own sexual thoughts, feelings or behaviour towards children. This includes online self-help modules to help those at risk of offending change and manage their behaviour.
In 2024 and 2025, an average of 28,000 users each month were redirected to the Stop It Now self-help resources after being served a warning, with an 81% engagement rate4 – meaning the vast majority of users stayed on the site to interact with the resources. Project Intercept is funded by Nominet’s Countering Online Harms Innovation Fund, and was developed in response to the growing number of people viewing sexual images of children and the wide accessibility of illegal content.
Deborah Denis, chief executive at Lucy Faithfull Foundation says: “Tens of millions of people have been reached through just 22 targeted interventions on tech platforms. That makes one thing clear – the potential to scale this approach is enormous. By placing more warnings across more online spaces, we can disrupt harmful behaviour at the moment it’s happening and prevent countless children from being harmed. The need has never been more urgent, particularly as new AI technologies accelerate the spread of online child sexual abuse.
“Through our Stop It Now services, we hear every day from people who are concerned about the direction of their online behaviour, but who struggle to change without support. Meanwhile, children are harmed every day. We must not leave children to protect themselves; it is the responsibility of adults, and especially tech companies, to keep them safe.
“We’ve proved it’s possible for tech companies to intervene effectively, preventing people from seeking illegal and abusive images of children and diverting them towards help to change. These warnings don’t simply block harmful content; they offer a clear pathway to support and behaviour change. Now we are calling on tech companies to work with us to scale what works and prevent harm before it happens.”
Someone who knows the value of support to change behaviour is Ben,5 who accessed the Stop It Now self-help modules after his searches on a pornography website triggered a warning message. Ben says: “After I got the warning on an adult site I was browsing, I had a look around the Stop It Now modules.”
Almudena Lara, Ofcom’s Child Protection Policy Director, says: “These findings highlight the important role that warning messages play in tackling child sexual abuse online, by encouraging offenders to change their behaviour. This is a core safety measure in our codes of practice, which was developed with the help of evidence submitted by the Lucy Faithfull Foundation.
“It’s encouraging to see many platforms deploying deterrence messaging as the result of effective partnerships between child protection experts and tech firms, and we’re pleased that some services appear to be going beyond what we expect.
“Research like this is crucial to us better understanding how effective this measure can be. However, it also underlines the scale of the problem that needs to be addressed. Change is happening, but there’s more to do, and we’ll continue to collaborate with a range of organisations to create a safer life online and hold platforms to account.”
Griffin Hunt, Product Manager for Google Search says: “Early in 2025, we transformed our CSAM deterrence messaging in Google Search to incorporate a public health perspective and to encourage help-seeking by people at risk of perpetrating child sexual exploitation and abuse. Since making these changes, we have observed greater engagement with the therapeutic help services we feature — like Stop It Now! — and a reduction in the rate at which users issue follow-on CSAM-seeking queries in Search.”
André Meister, Chief Technology Officer at Mega says: “There’s a common misconception that encrypted services are limited in how they can address harmful activity. In practice, we can and do take meaningful action, and one of several measures we’ve implemented has been in partnership with Lucy Faithfull Foundation’s Project Intercept.
“We recognised that it isn’t enough to reactively or even proactively remove material; we also needed to intervene earlier in the path towards harmful behaviour, before patterns become entrenched. Through our work with Project Intercept we are delivering well-timed deterrence messaging and self-help resourcing that interrupts harmful behaviour right at the point of intent, and we are pleased with the level of engagement the intervention has been driving. It is a more complete, science-backed approach, and we are grateful for the partnership in our continued fight against CSAM.”
Paul Fletcher, CEO of Nominet says: “Project Intercept reflects the best of what can be achieved when organisations come together to protect children. Together, we’ve built an intervention that reaches people at critical moments and guides them toward support before harm can happen. The results underline just how impactful this approach can be – it’s difficult work but it matters, and we’re proud of the progress so far. We remain committed to funding solutions that make the internet safer and to partnerships that deliver lasting change.”
Project Intercept is keen to hear from new potential technology partners. The charity urges technology companies to work with the Foundation’s clinical experts to implement offender-facing warning messages.
ENDS
For further information, and to discuss interview opportunities with a spokesperson from Lucy Faithfull Foundation, please contact:
Abigail Parry, Communications Manager: +44 7512 334961 / press@lucyfaithfull.org.uk
Notes to editors
1 Over a two-year period, 70 million triggered warnings equates to an average of 95,890 warnings per day.
2 682,730 people who triggered a warning message clicked through to Stop It Now.
3 Stop It Now is an anonymous and confidential service for anyone concerned about child sexual abuse. An independent evaluation of the Stop It Now helpline found clear demand for the service and evidence that it provides quality, cost-effective advice that can prompt behaviour change and strengthen protective factors that reduce the risk of harm. NatCen Social Research (2014), A public health approach to tackling child sexual abuse: Research on Stop It Now UK and Ireland, funded by the Daphne III programme of the European Union. A further independent evaluation of the Stop It Now helpline is currently underway and will be published later in 2026.
4 Engagement rate as defined by Google Analytics as the percentage of engaged sessions.
5 Ben is a pseudonym.
About Lucy Faithfull Foundation
Lucy Faithfull Foundation is a child protection charity that works to prevent child sexual abuse. Our main area of work is with people who pose a risk to children to divert them from causing harm. We also support individuals and families who have been affected by abuse and help professionals who work with families to create safer environments for children.
About Stop It Now
Stop It Now is a national campaign, website, and helpline run by the child protection charity Lucy Faithfull Foundation, offering anonymous support to anyone concerned about their own or someone else’s sexual thoughts or behaviour towards children, in order to protect children. Stop It Now works with government and child protection agencies to promote public education and prevent child sexual abuse. The Stop It Now helpline has been operating since 2002 and aims to prevent child abuse by encouraging people who are offending, or at risk of doing so, to seek help, and by giving adults the information they need to protect children safely. It is run by an experienced team of trained advisors – including former probation officers, social workers, psychologists and ex-police officers.
Since 2002, the helpline has provided advice and support to over 87,000 people. More than half of contacts come from people concerned about their own behaviour.
Lucy Faithfull Foundation also runs Shore, a website that provides an anonymous safe space for teenagers worried about their own or someone else’s sexual behaviour to ask questions and get advice: www.shorespace.org.uk
Facebook: @LucyFaithfullFoundation
X (Twitter): @Lucy_Faithfull_
LinkedIn: @Lucy Faithfull Foundation
Instagram: @lucyfaithfullfoundation