The NSPCC says the numbers are “incredibly alarming but represent only the tip of the iceberg” of what children are experiencing online.
By Rachael Venables, News Correspondent
Wednesday 22 February 2023 02:31, UK
According to police data, paedophiles have begun using virtual reality headsets to view images of child abuse.
The technology was recorded in eight cases in 2021/22 – the first time it had been explicitly mentioned in crime reports.
During that period, police recorded 30,925 offences involving obscene images of children – the highest number ever logged by forces in England and Wales.
Of these, a social media or gaming site was recorded in 9,888 cases – including Snapchat in 4,293, Facebook in 1,361, Instagram in 1,363 and WhatsApp in 547.
The NSPCC, which compiled the data, is calling for a series of changes to online safety legislation to prevent more children being subjected to abuse.
Sir Peter Wanless, Chief Executive of the NSPCC, said: “These new figures are incredibly alarming but represent only the tip of the iceberg of what children are experiencing online.
“We hear from young people who feel powerless and abandoned as online sexual abuse threatens to become the norm for a generation of children.
“By creating a child safety advocate to stand up for children and families, the government can ensure the legislation systemically prevents abuse.”
Read more:
NSPCC’s Childline reports 45% rise in boys experiencing online sexual abuse
Child Abuse Investigation: Turning a blind eye should be against the law
Mum reveals trauma of court delays after ex-husband sexually abused daughter
The NSPCC also wants a change in the law that would see senior managers of social media sites held criminally liable for child abuse on their platforms.
Sir Peter said: “It would be inexcusable if in five years’ time we are still playing catch-up with pervasive abuse that was allowed to proliferate on social media.”
A government spokesman said: “Protecting children is at the heart of the Online Safety Bill, and we have included tough, world-leading measures to achieve this goal while ensuring the interests of children and families are represented through the Children’s Commissioner.
“Virtual reality platforms are in scope and will be forced to keep children safe from exploitation and remove abhorrent child abuse content.
“If companies fail to tackle this material effectively, they could face huge fines and possible criminal sanctions against their senior managers.”
A spokesman for Meta – which owns Facebook, Instagram and WhatsApp – said: “This horrific content is banned from our apps and we report cases of child sexual exploitation to the NCMEC (National Center for Missing & Exploited Children).
“We are an industry leader in the development and use of technology to prevent and remove this content, and we are working with law enforcement, child safety experts and industry partners to address this societal issue.
“Our work in this space is never finished and we will continue to do everything we can to keep this content off our apps.”
A Snapchat spokesman said: “All child sexual abuse is despicable and illegal. Snap has dedicated teams around the world working closely with law enforcement, experts and industry partners to combat it.
“If we proactively discover or are made aware of sexual content exploiting minors, we will remove it immediately, delete the account and report the perpetrator to the authorities. Snapchat has additional safeguards in place that make it harder for younger users to be discovered and contacted by strangers.”
“I had no control”
Roxy Longworth was 13 when a 17-year-old boy she didn’t know contacted her on Facebook before forcing her to send pictures via Snapchat.
She said it left her feeling isolated and consumed by guilt, and soon a friend of his began using the images to pressure her into sending more explicit pictures.
“My whole life became about doing what he told me to do and hiding it from everyone,” Roxy said. “And the more photos he had, the more he had to blackmail me with, until eventually he demanded I send a video. He and his friend, they just completely owned me at that point, I had no control.”
It had a devastating effect on her mental health.
“The shame about it buried me,” she said. “I ended up getting very sick. I hurt myself many times, I stopped sleeping and eventually I was hospitalized with a psychotic episode. I was on suicide watch for about a year.”
She has written a book, When You Lose It, exploring what happened, but says it is still haunting to know the photos exist.
Roxy added: “It’s like a creeping feeling that you try to forget and then you realize those photos are still out there.
“They were in group chats with hundreds of people, they were everywhere.
“And the thing is – these are photos of a 13-year-old girl. It’s so messed up. It’s disgusting.”
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email [email protected] in the UK.