Health secretary Matt Hancock has warned that social-media platforms may be banned in the UK if they fail to remove harmful content, following growing concern over suicide and self-harm in teenagers.
On the BBC’s The Andrew Marr Show, Hancock said, “They need to do things they are refusing to do,” adding that if the platforms do not regulate their sites, the government must introduce legislation. When asked whether social media could be banned, he said, “Ultimately parliament does have that sanction, yes.” But it would be much better, he said, if social-media platforms cooperated.
The minister has previously called on tech giants to "purge" harmful material in the wake of a teenager's suicide.
In 2017, Molly Russell, 14, took her own life after viewing disturbing content about suicide on social media, having previously shown "no obvious signs" of severe mental-health issues.
Her family later discovered she had been viewing material on social media related to anxiety, depression, self-harm and suicide.
Russell’s father said, “Some of that content is shocking in that it encourages self-harm, it links self-harm to suicide and I have no doubt that Instagram helped kill my daughter." He further criticised Pinterest, saying, “[It] has a huge amount to answer for."
The family’s solicitor, Merry Varney, said Molly's case and “how algorithms push negative material" show the desperate need to investigate social-media platforms and how they could be "contributing to suicides and self-harm".
Hancock said he was “horrified” to learn of Molly’s death and is “desperately concerned to ensure young people are protected”.
He explained: "Lots of parents feel powerless in the face of social media. But we are not powerless. Both government and social-media providers have a duty to act.”
"I want to make the UK the safest place to be online for everyone – and ensure that no other family has to endure the torment that Molly's parents have had to go through."
Papyrus, a charity that works to prevent teenage suicide, says it has heard from around 30 families in the past week who believe social media played a part in their children’s suicides.
A spokesperson for the charity said, "We've had a spike in calls to our UK helpline since the BBC first reported this six days ago, all saying the same thing.”
Instagram responded by saying it works with expert groups who advise it on the "complex and nuanced" issues of mental health and self-harm. It added that sharing stories and connecting with others could be helpful for recovery. Instagram, which is owned by Facebook, admitted it "[doesn’t] remove certain content", but added the company is undertaking a full review of its enforcement policies and technologies.
A Pinterest spokesperson said: “We have a policy against harmful content and take numerous proactive measures to try to prevent it from appearing and spreading on our platform.”
"But we know we can do more, which is why we've been working to update our self-harm policy and enforcement guidelines over the last few months."