Child sexual abuse image crimes logged by police rise by 25%, says NSPCC

Friday 1 March 2024

The number of child sexual abuse image crimes recorded by UK police increased by a quarter last year, according to new data obtained by children’s charity the NSPCC.

The charity’s analysis of data gathered from 35 police forces through Freedom of Information requests found that 33,000 offences involving the collection and distribution of abuse images were logged by police during 2022/23.

The NSPCC said this figure was also up 79% since 2017/18, when the charity first called for online safety laws, and that 160,000 crimes had been recorded since then.

Where the platform involved was recorded by police, the NSPCC said Snapchat was flagged in almost half of the cases, with the Meta-owned platforms Facebook, Instagram and WhatsApp accounting for a quarter.

NSPCC chief executive Sir Peter Wanless said the charity wants online safety regulator Ofcom to “act with greater ambition” in enforcing the Online Safety Act, pushing tech companies to go further than the new laws will currently require of them in clamping down on child sexual abuse.

The charity has also repeated its warning about Meta’s plans to roll out end-to-end encryption across its messaging platforms, claiming it will prevent authorities from identifying offenders and safeguarding victims.

It called on Meta to pause its plans until the social media giant could prove child safety will not be compromised.

Meta has been contacted for comment.

In response to the NSPCC figures, a Snapchat spokesperson said: “Child sexual abuse is horrific and has no place on Snapchat.

“We use cutting-edge detection technology to find and remove this type of content, and work with police to support their investigations.

“Snapchat also has extra safety features for 13 to 17-year-olds, including pop-up warnings if they’re contacted by someone they don’t know.”
