Online grooming has reached record levels in the UK with the annual number of offences exceeding 7,000 for the first time, the NSPCC has said.
New data compiled by the children’s charity reveals that 7,062 offences for sexual communication with a child were recorded by police last year – an 89 per cent increase since 2018.
Girls are the main target for online grooming, making up 81 per cent of cases in 2023-24, the National Society for the Prevention of Cruelty to Children said.
Shockingly, the youngest victim of online grooming last year was a boy aged just five.
Meanwhile, parents have been warned that Snapchat is the preferred platform for predators with the messaging app used in almost half of cases.
While Snapchat was used in 48 per cent of cases, WhatsApp figured in 12 per cent, Facebook in ten per cent and Instagram in six per cent, according to the findings.
Video games and online chat rooms are also used by predators to groom children before they are encouraged to continue talking on encrypted messaging apps where abuse can proceed undetected.
The findings come as a man who abused at least 70 children online and drove one to suicide was last week sentenced to at least 20 years in jail following one of the biggest ‘catfishing’ investigations in the world.
Prosecutors said Alexander McCartney, 26, had targeted about 3,500 children aged between 10 and 16 in the UK, Europe, Australia and New Zealand and had caused ‘catastrophic damage’ to young girls all over the world.
Safeguarding minister Jess Phillips described online grooming and child sexual abuse as a ‘vile crime that inflicts long-lasting trauma on victims’.
Ms Phillips said that the Online Safety Act will force tech giants to crack down on this behaviour, adding: ‘Social media companies have a responsibility to stop this vile abuse from happening on their platforms.’
The Online Safety Act, passed a year ago, will mean tech giants face multi-million pound fines if they fail to protect users from harmful content.
However stricter controls on social media firms will not come into force for several months, as regulator Ofcom is still consulting on its codes of practice and guidance.
In response to its findings the NSPCC called on Ofcom to strengthen the rules that social media firms must follow to tackle child sexual abuse on their platforms.
The charity urged the regulator and the Government to shift focus from acting after children have been harmed to a proactive approach that would ensure features built into social media apps are not contributing to abuse.
Sir Peter Wanless, NSPCC chief executive, said: ‘One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.
‘We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.
‘It is clear that much of this abuse is taking place in private messaging, which is why we also need the Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.’
Becky Riggs, the National Police Chiefs’ Council lead for child protection and abuse investigations, described the NSPCC’s findings as ‘shocking’.
She added: ‘It is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow.
‘Policing will not stop in its fight against those who commit these horrific crimes. We cannot do this alone, so while we continue to pursue and prosecute those who abuse and exploit children, we repeat our call for more to be done by companies in this space.’
Snapchat was approached for comment.