Coroner’s report urges social media changes

A coroner has written to social media companies and the government, calling for action following the inquest into the death of schoolgirl Molly Russell.

The 14-year-old from Harrow ended her life in November 2017 after viewing content online about suicide and self-harm.

Coroner Andrew Walker issued six recommendations, including separating adult and children’s platforms and reviewing the algorithms used by websites.

Molly’s father Ian urged social media companies not to “hesitate”.

At the inquest held at North London Coroner’s Court last month, the coroner concluded that the schoolgirl died while suffering from the “negative effects of online content”.

In a report on preventing future deaths sent to the likes of Meta, Pinterest, Twitter and Snapchat, and to the UK government, on Thursday, Mr Walker listed these concerns and points of action for the social media companies and the government to take into account:

  • Separate platforms for adults and children

  • Age verification before joining a platform

  • Provision of age-specific content

  • Review the use of algorithms to serve content

  • Government to review the use of advertising

  • Parental or guardian controls, including access to and retention of material viewed by a child

The coroner also raised concerns about the lack of age verification when logging into the platforms, content that is not checked for age-appropriateness, and the use of algorithms to serve content alongside advertising.

Other issues included the lack of access or control for parents and guardians, and the inability to link a child’s account to a parent’s or guardian’s account.

In his report, Mr Walker said: “I recommend considering the establishment of an independent supervisory authority to monitor the content of online platforms.”


He also recommended that “legislation necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content should be considered”.

The coroner added that while any regulation “would be a matter for the government, I see no reason why the platforms themselves would not want to consider self-regulation.”

“I believe you and/or your organization have the power to take such action,” he wrote.

Meta, Pinterest, Twitter and Snapchat now all have 56 days to respond with a timeline for proposed actions or to explain why no actions are being proposed.

Molly’s father Ian Russell said he welcomed the coroner’s report and urged social media companies to “heed the coroner’s words and not wait for laws and regulations.”

“They should think carefully about whether their platforms are suitable for young people at all.”

He added that the government “also needs to act urgently to introduce its strict regulation of social media platforms to ensure children are protected from the effects of harmful online content, and that platforms and their senior managers face tough sanctions if they fail to take steps to curb the algorithmic amplification of destructive and highly dangerous content or fail to remove it quickly”.

Representatives from both Meta, Instagram’s parent company, and Pinterest gave evidence during the inquest.

Meta executive Elizabeth Lagone said she believed posts seen by Molly, which her family said “encouraged suicide”, were safe, while Pinterest’s Judson Hoffman told the inquest the site was “not safe” when the schoolgirl used it.


In response to the report, Meta agreed that “regulation is needed” and said it is “reviewing” the recommendations.

“We will continue to work hard, in collaboration with experts, teenagers and parents, so that we can continue to improve,” the company added.

In a statement, Pinterest said it is “committed to making continuous improvements to ensure the platform is safe for everyone” and that the coroner’s report will be carefully reviewed.