NSPCC calls for stronger protections for children in private messaging services

The NSPCC is urging the UK Government to ensure children are better protected in private messaging environments, as recently published Home Office data reveals police forces across England and Wales recorded 38,685 child sexual abuse image offences last year (2023/24) – an average of more than 100 every day. In the South East of England, police forces recorded 6,043 child sexual abuse image offences in 2023/24 – more than 16 every day.

The police forces contributing to this figure were Hampshire and Isle of Wight Constabulary, Kent Police, Sussex Police, Surrey Police and Thames Valley Police.

A separate Freedom of Information request submitted by the NSPCC last year to police forces across the UK showed that, where law enforcement recorded the platform used by perpetrators, half (50%) of offences took place on Snapchat and around a quarter (24%) on Meta products – 11% on Instagram, 7% on Facebook and 6% on WhatsApp.

Letter to Home Secretary and DSIT Secretary of State
In response, a joint letter from charities, including the NSPCC, Marie Collins Foundation, Lucy Faithfull Foundation, Centre of expertise on child sexual abuse, and Barnardo’s, has been sent to Home Secretary Yvette Cooper and Secretary of State for Science, Innovation, and Technology Peter Kyle.

The letter expresses collective concern regarding Ofcom’s final Illegal Harms Code of Practice published in December 2024. The charities argue that as it stands, children will not be protected from the worst forms of abuse on private messaging services under Ofcom’s plans, despite this being a core aim of the Online Safety Act.

Unacceptable loophole
Ofcom has stated that user-to-user services are only required to remove illegal content where it is ‘technically feasible’. This exception creates an unacceptable loophole, allowing some services to avoid delivering the most basic protections for children.

Data from police forces on recorded offences where the platform was known indicates that private messaging services are involved in more of these crimes than any other type of platform, with perpetrators exploiting the secrecy these spaces offer to harm children while going undetected.

The NSPCC wants the UK Government to push Ofcom to review and strengthen its most recent codes of practice for tackling this threat to children's safety online.

The charity is also calling for private messaging services, including those using end-to-end encryption, to put robust safeguards in place so that their platforms do not act as a 'safe haven' for perpetrators of child sexual abuse.

End-to-end encryption is a secure communication method in which only the communicating users can read the messages exchanged. This means that service providers can be blinded to child sexual abuse material being shared through their platform.
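To make that idea concrete, the short Python sketch below uses the open-source PyNaCl library to show, in miniature, why the provider of an end-to-end encrypted service only ever handles ciphertext. The key names and message are illustrative assumptions; this is a toy, not a description of how any particular messaging service implements its encryption.

    # Minimal E2EE sketch using PyNaCl (pip install pynacl). Illustrative only:
    # real messaging services layer far more machinery on top of this idea.
    from nacl.public import PrivateKey, Box

    # Each user generates a key pair; private keys never leave their devices.
    sender_key = PrivateKey.generate()
    recipient_key = PrivateKey.generate()

    # The sender encrypts with their private key and the recipient's public key.
    ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"a private message")

    # The service provider relays only this opaque ciphertext; holding no
    # private key, it cannot recover the plaintext - it is 'blind' to the content.
    print(ciphertext.hex()[:32], "...")

    # Only the recipient, holding the matching private key, can decrypt.
    plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
    assert plaintext == b"a private message"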

Messaging apps used in abuse crimes
Insight from Childline provides further evidence of how young people are being targeted or blackmailed into sharing child abuse images via the calculated use of private messaging apps.

During 2023/24, Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online. This was a 7% increase compared to 2022/23.

One girl, aged 13, said,

“I sent nude pics and videos to a stranger I met on Snapchat. I think he’s in his thirties. I don’t know what to do next. I told him I didn’t want to send him any more pictures and he started threatening me, telling me that he’ll post the pictures online.

“I’m feeling really angry with myself and lonely. I would like support from my friends, but I don’t want to talk to them about it as I’m worried about being judged.”

Sherwood: These offences cause tremendous harm and distress to children
Chris Sherwood, NSPCC Chief Executive, said,

“It is deeply alarming to see thousands of child sexual abuse image crimes continue to be recorded by police in the South East of England. These offences cause tremendous harm and distress to children, with much of this illegal material being repeatedly shared and viewed online. It is an outrage that in 2025 we are still seeing a blatant disregard from tech companies to prevent this illegal content from proliferating on their sites.

“Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place. This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act.

“The Government must set out how they will take a bold stand against abuse on private messaging services and hold tech companies accountable for keeping children safe, even if it requires changes to the platform’s design – there can be no excuse for inaction or delay.”

