NEWS

Instagram Bans Self-harm Content: Two Sides to the Story

By Becky Banham, updated on Feb 8, 2019


Instagram has proposed to remove all graphic self-harm images, following the death of a British teenager. But is this the only answer for young self-harmers?

The parents of 14-year-old Molly Russell said she took her own life in 2017 after viewing distressing material about depression and suicide. Her family found the posts when they inspected her Instagram account after her death, and concluded that the social media giant was partly to blame.

In an interview with the BBC, her father, Ian Russell, said: “I have no doubt that Instagram helped kill my daughter.”

Following a public outcry over Molly's death, Instagram chief Adam Mosseri admitted that the company had not done enough to keep people safe.

“We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable,” said Mr Mosseri. “We will get better and we are committed to finding and removing this content at scale.”

Instagram currently relies on users to report graphic images of self-harm. However, in its latest press release, Instagram has announced changes to this process, following talks with experts and academics. This comes only a week after the Mental Health Foundation published new guidelines to encourage safe and healthy internet use for the whole family.

In addition to banning 'graphic' images of self-harm, Instagram said it will not promote non-graphic self-harm-related content, such as images of healed scars, and users will no longer be able to find such images via hashtags or the Explore tab. However, the platform will not remove this kind of content entirely, so as not to “stigmatise or isolate people who may be in distress and posting self-harm-related content as a cry for help.”

However, not everyone is keen on Instagram’s proposed changes. April Foreman, a psychologist and a member of the American Association of Suicidology’s board, said in an interview that there is not enough research to indicate that removing graphic images of self-harm will be effective in alleviating suicide risk. “We’re doing things that feel good and look good instead of doing things that are effective,” she said. “It’s more about making a statement about suicide than doing something that we know will help the rates.”

Other critics are keen to point out that young people searching for self-harm content online are likely to be harming themselves already. By removing the ability to find this content, the social media giant may be taking away an avenue of support for them.

Counsellor Peter Klein provides some insight: “Seeking people that are expressing their emotions and suffering online can help self-harmers feel like they are not alone, whilst providing a vehicle to express their own feelings.”

However, Peter points out, this community does not always help young people overcome their mental health problems. “The huge downside is that existing in a community of self-harmers can promote such behaviour whilst also making sufferers far more resistant towards implementing positive change.”

Instagram has vowed to combat this, however, with a focus on creating resources for people who post or search for self-harm-related content. The platform will also continue consulting with experts to see if there is anything more it can do.

Certainly, if you now attempt to search for the hashtag #selfharm, you will be met with new options on your screen: to seek support, or to confirm that you want to view the content despite Instagram's warnings.

Speaking after a meeting with Instagram's boss, Health and Social Care Secretary Matt Hancock described the death of Molly Russell as “every parent's modern nightmare” and said it was right for Instagram to take down “the most graphic material” in order to fulfil its duty of care to all social media users. Nonetheless, he added that “we need to be led by what the clinicians and experts say need to be taken down” and that he is prepared to legislate if necessary.

Mr Russell has responded to Instagram's commitment and, in an interview with the BBC, called on other social media platforms to follow suit: “It is now time for other social media platforms to take action to recognise the responsibility they too have to their users if the internet is to become a safe place for young and vulnerable people.”


If you are struggling with distressing feelings and need to talk with someone immediately, please seek support as soon as possible.

You can call Samaritans free on 116 123 (UK and Ireland), email [email protected], or visit the Samaritans website to find details of the nearest branch.

If you are a parent concerned about the wellbeing of your child, you can find out more about how to help kids and teens stay safe online on Counselling Directory.

Or, if you are struggling and need to talk, you may benefit from seeking professional support. You can search for an experienced counsellor or therapist near you on Counselling Directory.
