Social media platforms are not doing enough to prevent users, especially young people, from seeing and being affected by self-harm and suicide content, according to a new study.
A survey of social media users by Samaritans and Swansea University found that 83% of respondents had been recommended self-harm content without searching for it.
The researchers noted that the study used a social media campaign to encourage people to complete an online survey, and that this may have affected the results, since people with experience of self-harm and suicide were more likely to choose to participate. Even so, they said the findings underscored how harmful such content could be, especially for vulnerable young people.
The survey showed that 76% of those who had seen self-harm or suicide content said they had hurt themselves more seriously because of it.
Additionally, it found that three-quarters of participants first saw self-harm content online when they were aged 14 or younger, with the charity urging platforms to do more now to protect their users rather than waiting for regulation to be imposed on them.
The vast majority of respondents (88%) said they wanted more control over filtering the content they see on social media, while 83% said more specific trigger warnings, such as using the terms self-harm or suicide in content warnings, would be useful to them.
“We would never put up with people pushing this kind of material uninvited into our mailbox, so why should we allow it to happen online?” Samaritans chief executive Julie Bentley said.
“Social media sites are simply not doing enough to prevent people from seeing clearly harmful content and they need to take it more seriously.
“People can’t control what they see because sites aren’t making the changes needed to stop that content being delivered to them, and that’s dangerous.
“Sites need to put more controls in place, along with better signposting and improved age restrictions.
“The Online Safety Bill must become law as soon as possible to reduce access to all harmful content on all sites, regardless of size, and most importantly to ensure this is addressed for children and adults alike.
“We are eagerly awaiting the Bill to return to the House of Commons after many delays, but there is nothing stopping the platforms from making changes now.
“The internet moves much faster than any legislation, so platforms shouldn’t wait for the Bill to become law before making changes that could save lives.”
Professor Ann John, from Swansea University and co-lead of the study, said more research into the subject was needed to get a clearer picture of the national impact of such content, but said it was clearly detrimental to many people.
“While our study cannot claim to represent the entire population’s experience of this content since only interested individuals would have responded to our inquiries, many of the themes clearly indicate how social media platforms can improve,” she said.
“People want more control over the content they see, ways to make sure kids meet age requirements, and co-produced safety features and policies. It all sounds very doable.”