The Guardian - UK
Dan Milmo, global technology editor

‘The bleakest of worlds’: how Molly Russell fell into a vortex of despair on social media

Molly Russell was described as ‘someone full of love and hope and happiness, a young person full of promise and opportunity and potential’. Photograph: The Russell family

On the evening of 20 November 2017 Molly Russell and her family had dinner together and then sat down to watch an episode of I’m a Celebrity … Get Me Out of Here!.

A family meal, then viewing a popular TV show: a scene typical of millions of families around the UK. As Molly’s mother, Janet, said to police: “Everybody’s behaviour was normal” at dinner time.

The next day, at about 7am, Janet went to Molly’s bedroom and found her daughter’s body.

Molly, 14, from Harrow, north-west London, had killed herself after falling, unbeknown to her family, into a vortex of despair on social media. Some of the content she viewed in the final year of her life was a world away from prime-time family TV.

It was, as Molly’s father, Ian, put it at the inquest into his daughter’s death, “just the bleakest of worlds”.

“It’s a world I don’t recognise. It’s a ghetto of the online world that once you fall into it, the algorithm means you can’t escape it and it keeps recommending more content. You can’t escape it.”

On Friday the senior coroner at North London coroner’s court ruled at the end of a two-week hearing that Molly had died from an act of self-harm while suffering from depression and “the negative effects of online content”.

In many ways Molly had the interests and hobbies of a typical teenager: the musical Hamilton, the rock band 5 Seconds of Summer, the lead role in her school show. Ian Russell emphasised this part of Molly’s life as he paid an emotional tribute to her at the start of the inquest at North London coroner’s court, talking of a “positive, happy, bright young lady who was indeed destined to do good”.

Ian Russell arriving at North London coroner’s court in Barnet on the first day of the inquest into his daughter’s death. Photograph: Kirsty O’Connor/PA

He said: “It’s all too easy to forget the person she really was: someone full of love and hope and happiness, a young person full of promise and opportunity and potential.”

But Russell said the family had noticed a change in Molly’s behaviour in the last 12 months of her life. She had become “more withdrawn” and spent more time alone in her room, her father said, but still contributed “happily” to family life. The Russells put her behaviour down to “normal teenage mood swings”.

In September 2017 Russell told his daughter the family was concerned about her, but she described her behaviour as “just a phase I’m going through”. Indeed, Russell said Molly appeared to be in “good spirits” in the final two months of her life.

Some of Molly’s social media activity – music, fashion, jewellery, Harry Potter – reflected the interests of that positive, bright person depicted by her father.

But the darker side of Molly’s online life overwhelmed her. Of 16,300 pieces of content saved, liked or shared by Molly on Instagram in the six months before she died, 2,100 were related to suicide, self-harm and depression. She last used her iPhone to access Instagram on the day of her death, at 12.45am. Two minutes before, she had saved an image on the platform that carried a depression-related slogan.

It was on Instagram – the photo-, image- and video-sharing app – that Molly viewed some of the most disturbing pieces of content, including a montage of graphic videos containing clips relating to suicide, depression and self-harm set to music. Some videos contained scenes drawn from film and TV, including 13 Reasons Why, a US drama about a teenager’s suicide that contained episodes rated 15 or 18 in the UK. In total, Molly watched 138 videos that contained suicide and self-harm content, sometimes “bingeing” on them in batches including one session on 11 November.

A consultant child psychiatrist told the hearing he couldn’t sleep well for weeks after viewing the Instagram content seen by Molly just before her death.

As the court went through the six months of Instagram content, it was shown a succession of images and clips that contained slogans relating to suicide and depression, or graphic images of self-harm and suicide. Some content, such as the video clips, was repeated more than once in court, giving those present an idea of how Ian Russell felt when he said the “relentless” nature of the content “had a profound adverse impact on my mental health”.

The court was told Molly had left “behind a note that quotes” a depressive Instagram post she had viewed, while a separate note started on her phone quoted from one of the video montages. Oliver Sanders KC, representing the Russell family, said “this is Instagram literally giving Molly ideas”.

Elizabeth Lagone, the head of health and wellbeing policy at Meta, the owner of Instagram and Facebook, was ordered by the coroner to fly over from the US to give evidence and was taken through many of the posts and videos by Sanders. She defended the suitability of some of the posts, saying they were “safe” for children to see because they represented an attempt to raise awareness of a user’s mental state and share their feelings. Sanders questioned whether a 14-year-old could be expected to tell the difference between a post raising awareness of self-harm and one that encouraged it.

Elizabeth Lagone, Meta’s head of health and wellbeing, arriving at North London coroner’s court. Photograph: Beresford Hodge/PA

Some content was clearly indefensible, even under Instagram’s 2017 guidelines, and Lagone apologised for the fact that Molly had viewed content that should have been taken off the platform, because it glorified or encouraged suicide and self-harm.

But the content that Lagone sought to defend – as, for example, “an expression of somebody’s feelings” – drew expressions of exasperation from Sanders. He questioned how posts containing slogans like “I don’t want to do this any more” could be appropriate for a 14-year-old to view.

Raising his voice at one point, he said Instagram was choosing to put content “in the bedrooms of depressed children”, adding: “You have no right to. You are not their parent. You are just a business in America.” Instagram has a minimum age limit of 13, although Molly was 12 when she set up her account.

The Pinterest footage was also disturbing. The inquest was told Molly had used the platform, where users collect images on digital pinboards, and had searched for posts under terms like “depressing qoutes [sic] deep”, and “suicial [sic] qoutes”.

One board in particular, which Molly titled “nothing to worry about…”, contained 469 images, some of them related to self-harm and suicide. Others related to anxiety and depression, while it emerged that Pinterest had sent content recommendation emails to Molly with titles such as “10 depression pins you might like”.

Jud Hoffman, the head of community operations at Pinterest, told the inquest he “deeply regrets” what Molly saw, and that the platform was not safe at the time.

Hoffman also said the platform was still “not perfect” and that content violating its policies “still likely exists” on it. Campaigners for internet safety, such as the Russell family, argue that this applies to other platforms as well.

Jud Hoffman, global head of community operations at Pinterest. Photograph: James Manning/PA

The court also heard Molly had a Twitter account that she used to contact Salice Rose, an influencer who has discussed her experience of depression online, in an attempt to gain help. Ian Russell described it as “calling out into a void” and said it was a “danger” for people like Molly to seek support from well-meaning influencers who could not offer specialist support.

He also looked at Molly’s YouTube account after her death and found a “high number of disturbing posts” concerning anxiety, depression, self-harm and suicide.

Throughout the hearing the senior coroner, Andrew Walker, raised potential changes to how social media platforms operate with regard to child users. Change has already arrived with the age-appropriate design code, which prevents websites and apps from misusing children’s data, while the forthcoming online safety bill will impose a duty of care on tech firms to protect children from harmful content.

In a pen portrait of his daughter read out to the inquest, Ian Russell said he wanted to deliver a message of hope alongside the loss: that a tragedy played out against a backdrop of poorly regulated social media platforms must not be repeated.

“Just as Molly would have wanted, it is important to seek to learn whatever we can and then to take all necessary action to prevent such a young life being wasted again.”

• In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255 or chat for support. You can also text HOME to 741741 to connect with a crisis text line counselor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
