Jail terms for social media bosses found to have run “unsafe” businesses would help make platforms think about user safety, Molly Russell’s father has said.
Ian Russell said action that would “realign” executives’ minds was needed to prevent them from “prioritizing profit”.
Speaking to the PA news agency after a coroner suggested child and adult platforms should be separated following the inquest into Molly’s death, Mr Russell said he understood the reasons for the conclusion but added: “It’s going to be quite difficult to implement them.”
Coroner Andrew Walker sent a Prevention of Future Deaths (PFD) report to companies including Meta, Pinterest, Twitter and Snapchat, as well as the UK government, urging a review of the algorithms the sites use to deliver content.
“I can’t wait to see the answers,” said Mr Russell.
“It will be very revealing to see how the platforms react to the coroner’s report on preventing future deaths, and how they propose to make their platforms safer for users – especially young users.”
Molly, 14, from Harrow, north west London, ended her life in November 2017 after viewing suicide and self-harm content online, prompting her family to advocate for better internet safety.
On the suggestion to have separate platforms for children and adults, Mr Russell said: “I can understand why the coroner suggested this because in our offline world we separate children from harm.
“We don’t allow them to buy alcohol until they’re 18 and they can’t buy a sharp knife for similar reasons.
“So I can understand the reasoning behind it – but in practice, as I understand it, it’s going to be quite difficult to implement because the platforms are already struggling to know the age of the people who are on their platform.
“Until you know for sure how old the users of your platforms are, you cannot separate these two target groups.
“If you can’t separate them, there’s always a risk that you’re putting the wrong person in the wrong half of that separation.”
Mr Russell continued: “I think age verification is crucial but I don’t think age verification has yet come of age, if you’ll excuse the pun.
“The technologies are improving, but at the moment they are not well implemented, especially on large global platforms.
“I’m sure that will improve fairly quickly over time because technology is moving so quickly, and if you can accurately verify or confirm a person’s age … then maybe you can start taking steps to provide different content for different age groups.”
Reflecting on what he had heard from executives at Meta and Pinterest at the inquest into his daughter’s death, Mr Russell said: “It became a tale of two platforms.
“The way Pinterest reacted seemed genuine; they seemed to want to improve the safety of their platform.
“If you look at Meta, they approached it from the other direction – they came wearing their corporate hat, denying blame wherever possible.
“From time to time I’ve shown people, with warnings, some of the content I found on Molly’s social media accounts, and I don’t know anyone who thinks it is remotely safe, who enjoyed watching it, or who was not shocked by what they saw.
“The only person I’ve met in the entire world since Molly’s death who thought the content was safe was Meta’s Elizabeth Lagone.”
Looking back on his involvement in campaigning for online safety, Mr Russell said: “I don’t see what I do as a campaign. I see what I do as speaking the truths I have found about the way young people use social media and the harmful content they are exposed to when they use social media.
“We learned this in the most horrible and heartbreaking way – we lost our youngest daughter and we will never really get over it.”
Referring to the Online Safety Bill, Mr Russell said it will “absolutely take time” to be implemented, but added that he was concerned by rhetoric about how freedom of expression could be compromised.
“What is incomprehensible is that, when the bill has been scrutinized so thoroughly and drafted so carefully, new objections are still being raised,” he said.
“There is talk that the law may be some form of censorship that will affect freedom of expression – obviously it would be wrong for that to happen.
“But a bill that’s been so well scrutinized is unlikely to do that, because with all that scrutiny it would not have got to this stage.
“So it seems to me that there are other motives behind these claims.
“One of the biggest cards tech companies play to get people to object to things is the free speech card, and although freedom of speech is a very legitimate concern, when I hear people playing this card at this late stage I’m inclined to think that what they’re really doing is speaking the words of the tech lobbyists, rather than speaking out of their own belief.”
Mr Russell said what is being proposed in the Online Safety Bill is a “really good first step” but stressed there must be accountability for people who post content on the platforms, without “unduly compromising their privacy”.
He added: “The algorithms also need to be monitored and regulated, so that if a company is using an unsafe algorithm to promote unsafe material, it can be stopped.”
Asked if he believed there was a criminal case to answer in Molly’s case, Mr Russell said: “I think that’s a really difficult question.
“In terms of criminal liability, I think this is something that legislation should address – it is addressed in the Online Safety Bill, but I think it should be strengthened.
“Similarly, there is criminal liability for corporate manslaughter – so if an executive … was found to have run a company in a way that was unsafe and did not adequately protect users of its platform, they could face a jail sentence.
“I think that would refocus their minds.
“It’s only when their minds are realigned that the corporate culture of these platforms will change — and that’s essentially what’s required for platforms to stop prioritizing profit and start thinking about their users… and about their users’ safety.”
When asked what he was hoping for from social media companies in their responses to the PFD report, Mr Russell said: “I’m not expecting too much because I’ve learned the hard way.
“They have promised to keep working to make their platforms safer – but looking now, much of the content Molly saw is still available on Instagram and other platforms.
“They failed to live up to those words – so I won’t listen to the words, I’ll judge them by their actions.
“I hope they come up with concrete suggestions that make their platforms safe, but only when they implement them will I be happy.”