Instagram content viewed by Molly Russell, which her family argued “encourages” suicide and self-harm, was safe, the site’s head of health and wellbeing has said.
Meta executive Elizabeth Lagone was taken through a number of posts the schoolgirl engaged with on the platform in the last six months of her life, which she described as “by and large, admissive”.
The senior executive told an inquest at North London Coroner’s Court she thought it was “safe for people to be able to express themselves”, but conceded two of the posts shown to the inquest would have violated Instagram’s policies.
Coroner Andrew Walker asked Ms Lagone on Monday “what gives you the right” to make decisions on what material was safe for children to view, but the witness said the site worked “closely with experts”, adding that decisions were not “made in a vacuum”.
Molly, from Harrow in north-west London, died in November 2017, prompting her family to campaign for better internet safety.
During the day’s proceedings, videos the teenager accessed on Instagram were played to the court with the coroner once again warning the material had the “potential to cause great harm”.
He said the content “seeks to romanticise and in some way validate the act of harm to young people,” before urging anyone who wanted to leave the room to do so, with one person leaving.
The Russell family’s lawyer, Oliver Sanders KC, spent around an hour taking Ms Lagone through Instagram posts liked or saved by the 14-year-old, and asked if she believed each post “promoted or encouraged” suicide or self-harm.
Referring to one post seen in May 2017, Mr Sanders asked: “Do you think it helped Molly to see this?”
Ms Lagone said: “I can’t speak to this.”
“Six months after seeing this, she was dead,” Mr Sanders continued.
“I can’t speak to the different factors that led to her tragic loss,” Ms Lagone responded.
The witness told the court she thought the content was “nuanced and complicated”, adding that it was “important to give people that voice” if they were expressing suicidal thoughts.
The inquest was told out of the 16,300 posts Molly saved, shared or liked on Instagram in the six-month period before her death, 2,100 were depression, self-harm or suicide-related.
Mr Sanders directed the witness to a note on Molly’s phone which used the words “I just want to be pretty”, saying the language was identical to a post the teenager had viewed on Instagram two days before.
“Do you see the connection there?” Mr Sanders asked.
Ms Lagone said: “I see that is similar language.”
“It’s identical language… this is Instagram literally giving Molly ideas that she needed to be concerned about her weight, correct?” Mr Sanders asked.
Ms Lagone said: “I can’t speak about what Molly may have been thinking.”
Referring to all the material viewed by the teenager the family considered to be “encouraging” suicide or self-harm, Mr Sanders continued: “Do you agree with us that this type of material is not safe for children?”
Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.
“Do you think this type of material is safe for children?” Mr Sanders continued.
Ms Lagone said: “I think it is safe for people to be able to express themselves.”
After Mr Sanders asked the same question again, Ms Lagone said: “Respectfully, I don’t find it a binary question.”
The coroner interjected and asked: “So you are saying yes, it is safe or no, it isn’t safe?”
“Yes, it is safe,” Ms Lagone replied.
The coroner continued: “Surely it is important to know the effect of the material that children are viewing.”
Ms Lagone said: “Our understanding is that there is no clear research into that.
“We do know from research that people have reported a mixed experience.”
Questioning why Instagram felt it could choose which material was safe for children to view, the coroner then asked: “So why are you given the entitlement to assist children in this way?
“Who has given you the permission to do this? You run a business.
“There are a great many people who are … trained medical professionals. What gives you the right to make the decisions about the material to put before children?”
Ms Lagone responded: “That’s why we work closely with experts.
“These aren’t decisions we make in a vacuum.”
The coroner continued: “So having created this environment, you then seek to make it safe?”
Ms Lagone replied: “Certainly, we take the safety of users very seriously…”
“What did people do before people created this environment for them?” the coroner asked.
“I’m not sure what you mean,” Ms Lagone said.
“You create the danger and then you take steps to lessen the risk and danger?” the coroner continued.
Ms Lagone said: “The technology has been developed and… we take our responsibility seriously to have the right policies and processes in place.”
Last week, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when the 14-year-old used it.
Mr Hoffman said he “deeply regrets” posts viewed by Molly on Pinterest before her death, saying it was material he would “not show to my children”.
The inquest, due to last up to two weeks, continues.