The Guardian - UK
Technology
Shanti Das

Instagram still hosting self-harm images after Molly Russell inquest verdict

Ian Russell, the father of Molly Russell, speaks to media outside Barnet coroner’s court in north London, after the inquest into his daughter’s death. Photograph: Joshua Bratt/PA

Instagram is breaking its promise to remove posts that glorify self-harm and suicide years after the death of the schoolgirl Molly Russell, Observer analysis has found.

The photo-sharing app has long claimed it does not allow material that “promotes or glorifies self-harm or suicide” and says it removes content of this kind.

In February 2019, in the wake of 14-year-old Molly’s death in 2017, it said it would no longer allow any content depicting graphic self-harm, such as cutting. Eight months later, it extended that ban to cover cartoons and drawings after an appeal from Molly’s father, Ian.

But simple keyword searches on the platform last week showed that many such images remain live. The posts include pictures uploaded since the September inquest into Molly’s death, which concluded that she died from “an act of self-harm while suffering from depression and the negative effects of online content”. Other images were posted several years ago but were not detected or removed by the platform’s review teams or technology.

Many of the posts were made under hashtags including slight misspellings of banned or restricted searches. Users searching for banned terms minus the vowels, for example, were shown a stream of posts that appeared to go against Instagram’s community guidelines.

Several keywords relating to self-harm and suicide led to users being shown a warning that said: “Can we help? Get support”. But directly beneath, users could click “show posts”. Top results for one common misspelling of a banned term included a photo of blood spattered on a tiled floor and a picture of a coffin with words over the top: “Now everyone loves me”.

Another top result showed a cartoon depiction of someone who had taken their own life – a clear breach of Instagram’s rules. It was posted in April 2018, five months after Molly’s death, and remained live until this weekend, when it was removed after being flagged with Instagram by the Observer.

In other cases, users searching for banned terms were shown a list of alternative hashtags that they could try searching instead. Some of the suggestions related to keywords associated with recovery and self-harm prevention, but others did not. Under one suggested tag, top posts, which have since been removed, included images that appear to normalise or glamorise self-harm, such as a meme describing fresh cuts as “perfect” and a photo of a lighter with the words: “We’re all addicted to something that takes the pain away.”

Molly Russell in 2016. Photograph: The Russell family

Instagram says it allows some self-harm and suicide-related content on the advice of experts because it does not want to exclude people in need of help or prevent them from expressing how they feel. It says it shows pop-up support messages, hides posts that may be sensitive and promotes content relating to recovery and prevention.

But the findings suggest that the platform, which permits users as young as 13, is failing to effectively enforce its own policies on barring material that promotes or glorifies self-harm, and raise concerns about the effect on those who see it.

Anna Edmundson, head of policy at the NSPCC, said: “It is absolutely shocking that this appalling content can still be found on Instagram and that tech firms continue to get away with playing by their own rules. It is insulting to Molly Russell’s family and other young people and their families who have suffered the most horrendous harm from this content that it is still available.”

The inquest into Molly’s death, at north London coroner’s court last month, heard that she had saved, liked or shared 2,100 pieces of content related to suicide, self-harm and depression in the six months before she died, including drawings, memes and graphic images. She last used her phone to access Instagram at 12.45am on the day she died, the court was told. Two minutes earlier, she had saved an image that carried a depression-related slogan.

On Friday, Molly’s father, Ian Russell, called on the government to take urgent action to regulate online platforms, telling the BBC there was “no time for delay”. A long-awaited online safety bill is expected to return to parliament in the near future.

Critics have claimed the bill in its current form focuses too much on policing types of content rather than tackling the “algorithmic systems and design features” that underpin the biggest platforms. Dr Susie Alegre, a human rights lawyer, said it “not only threatens free speech … but also fails to do enough to tackle the real drivers of online harm, such as social media companies actively recommending posts about self-harm”.

Meta, Instagram’s parent company, said it had removed violating posts flagged by the Observer. A spokesperson said: “Our thoughts are with the Russell family and everyone who has been affected by this tragic death. We’re committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers, and we will carefully consider the coroner’s full report when he provides it.”

A spokesperson added that the platform blocks hashtags that inherently go against its policies. She said people would always try to get around the rules by using deliberate misspellings, adding that its work on this was “never done”.

In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or by email at pat@papyrus-uk.org. In the UK and Ireland, Samaritans can be contacted on 116 123 or by emailing jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org. You can contact the mental health charity Mind by calling 0300 123 3393 or visiting mind.org.uk.
