
52% of online child sex offences where the platform was known took place on Facebook-owned apps.

Facebook’s apps were used in more than half of online child sex crimes, new NSPCC data reveals, as the charity calls on the Government to deliver meaningful change in the Online Safety Bill to tackle the biggest threat to children online.

In a single year, police recorded more than 9,470 instances of child sexual abuse image and online child sex offences where the means of communication was known – 52 per cent of which took place on Facebook-owned apps. 

Instagram was used more than any other Facebook-owned platform, in over a third of all instances.

Meanwhile, Facebook and Messenger were used in a further 13 per cent, according to the data obtained from 35 police forces in England, Wales and the Channel Islands by the NSPCC via Freedom of Information requests. 

Fears over end-to-end encryption
The charity fears many of these offences could go unreported if Facebook proceeds with end-to-end encryption without the necessary safeguards in place.

It is urging the Government to give Ofcom the power to take early and meaningful action against firms whose dangerous design choices put children at risk.

It is crucial that private messaging is within the scope of the legislation, but the charity argues that the current plans, released by the Government in December, need to be strengthened. 

Will hinder ability to identify and disrupt child abuse
End-to-end encryption offers a range of benefits, but child protection experts, law enforcement agencies worldwide and Facebook itself have said it will hinder the ability to identify and disrupt child abuse on Facebook’s services.  

The NSPCC has repeatedly demanded that it should only be rolled out if and when platforms can demonstrate it won’t compromise children’s safety. 

WhatsApp
The issue is brought into sharp focus by WhatsApp, which accounts for one in ten instances recorded by police where Facebook’s apps were involved in online child sexual abuse, according to the new data. 

But last year WhatsApp accounted for just 1.3 per cent of child abuse tip-offs from Facebook to the NCA, because the platform cannot see the content of messages to report abuse. 

A major source of risk
Private messaging is a major source of risk as it is the most common avenue for abusers to contact children.

Last month, the Office for National Statistics revealed children are contacted via direct message in nearly three quarters of cases when they are approached by someone they don’t know online. 

Burrows: Facebook willingly turning back the clock
Andy Burrows, NSPCC Head of Child Safety Online Policy, said:

“Facebook is willingly turning back the clock on children’s safety by pushing ahead with end-to-end encryption despite repeated warnings that their apps will facilitate more serious abuse more often. 

“This underlines exactly why Oliver Dowden must introduce a truly landmark Online Safety Bill that makes sure child protection is no longer a choice for tech firms and resets industry standards in favour of children. 

“If legislation is going to deliver meaningful change it needs to be strengthened to decisively tackle abuse in private messaging, one of the biggest threats to children online.” 

Calling on the Government
The NSPCC is calling on the Government to: 

  • Shift the onus onto tech firms to show they are identifying and mitigating risks in products before they roll them out. Under the current plans, the onus is on Ofcom to prove risk, rather than on companies to show they are taking steps to protect children. But Ofcom will not be able to do this with end-to-end encryption in place, because the majority of child abuse reports will disappear, creating a catch-22 for the regulator.  
     
  • Give Ofcom the power to force tech firms to act before harm has happened rather than after. Under the current plans, the regulator needs to demonstrate persistent and prevalent child abuse before it can force platforms to act. But the thrust of the legislation should be to catch harm at the earliest stage to prevent it.  
     
  • Make Ofcom consider the interplay of risky design features, to see whether, in combination, they are likely to exacerbate risk. End-to-end encryption is likely to present particularly severe risks if it can be exploited by abusers in conjunction with other high-risk design choices, for example algorithmic friend suggestions and livestreaming functionality. This is why the NSPCC is particularly concerned about the proposals to introduce end-to-end encryption on Facebook.   

Earlier this month, the NSPCC published its ‘Delivering a Duty of Care’ report, which assessed plans for legislation against its six tests for achieving bold and lasting protections for children online. 

The NSPCC has been the leading voice for social media regulation, and the charity set out detailed proposals for a Bill in 2019. The draft Bill is expected in the spring.  


News shared by Sophie on behalf of NSPCC. Ed

Image: kaboompics under CC BY 2.0