The UK health secretary has stepped up the pressure on social media platforms to do more to deal with harmful content that may encourage suicide and self-harm, warning some of the biggest tech groups that laws to regulate them could be introduced.

The move by Matt Hancock comes after the father of Molly Russell, a 14-year-old London schoolgirl who died in 2017, accused social media companies including Instagram and Pinterest of abetting child suicides by hosting disturbing images and content, normalising self-harm and targeting such content at specific users.

“We don’t know but my gut instinct is that various forces and various parts of the internet had a catastrophic effect on Molly’s mental health that led to her suicide,” Ian Russell said in an interview with The Sunday Times.

Mr Hancock told the BBC he would be prepared to introduce legislation if social media companies did not take action. Earlier it emerged the health secretary had written to Twitter, Snapchat, Pinterest, Apple, Google and Facebook, which owns Instagram and WhatsApp.

“We can legislate if we need to,” Mr Hancock told The Andrew Marr Show. “It would be far better to do it in concert with the social media companies, but if we think they need to do things that they are refusing to do, then we can and we must legislate.

“We are masters of our own fate as a nation and we must act to make sure that this amazing technology is used for good, not leading to young girls taking their own lives.”

The government is already developing a white paper addressing “online harms”, which is likely to include content involving suicide and self-harm.

The companies ban content that promotes self-harm and have between them hired tens of thousands of moderators to review posts and pictures shared online. However, Mr Hancock’s warning will add to the many existing calls for them to monitor terrorist content, fake news, hate speech and online abuse.

“This is an extremely sensitive issue, and we will continue to work closely with the UK government and our mental health partners to ensure the safety of users is paramount,” Twitter said. Snap and Facebook declined to comment.

Instagram said it had launched a review of policies and technologies related to self-harm, suicide and eating disorders following the news. “While we undertake this review, we are taking measures aimed at preventing people from finding self-harm related content through search and hashtags,” the company said.

A person familiar with the review said Instagram would add a “sensitivity screen” to blur out images of self-harm and block results when users searched for words commonly used to share harmful content.

Pinterest said: “We have a policy against harmful content and take numerous proactive measures to try to prevent it from appearing and spreading on our platform. But we know we can do more, which is why we’ve been working to update our self-harm policy and enforcement guidelines over the last few months.”


