The Independent UK
Lydia Patrick

‘My 13-year-old daughter downloaded TikTok and six months later she was dead’


“Daddy, may I download TikTok?” Maia politely asked her father, five months after she had turned 13.

Liam Walsh double-checked the legal age for the app and said he trusted his mature child to be able to use it responsibly. Like other teenagers, Maia refused to listen to her parents’ calls to get off her phone as she spent hours lost in the virtual world.

But six months later, at 3.15am on 7 October, Mr Walsh received a phone call that will haunt him forever.

“Maia is dead,” her mother screamed. His little girl had taken her own life three weeks before her 14th birthday.

The father doted on the daughter he described as an ordinary, cheerful 13-year-old girl who excelled at school and enjoyed playing Minecraft.

“She was a sweet, kind, chirpy child until TikTok happened,” he said. “My world turned to bits. In hindsight, I look back and wonder if anything was going on. I don’t know if she was dragged into a world she didn’t have the maturity to understand.”

Liam Walsh described his daughter as a cheerful and happy girl and fears she was dragged into a ‘disgusting wormhole of risk’ through algorithms
— (Provided)

Mr Walsh is one of a number of bereaved parents sharing their harrowing stories of losing children in circumstances where social media is suspected of playing a part, after Ofcom laid out its first steps towards regulating the online sphere.

Ofcom’s chief executive, Dame Melanie Dawes, said the regulator “cannot waste a moment” in setting out how it will protect children online. Ofcom will work alongside social media sites to ensure harmful material, such as child sexual abuse material, grooming and violent content, is removed.

Mr Walsh is now campaigning for full transparency over Maia’s online data in the months leading up to her death. As it stands, social media platforms have a 90-day data retention policy.

The 48-year-old says he saw two disturbing videos romanticising self-harm and suicide in his daughter’s TikTok ‘like’ history, which has been seen by The Independent.

“My quest is to find out what led my child to do something so destructive and dangerous. Maia is missing her life. Maybe there is nothing to see, but I deserve to have the chance to find out.”

Isaac Kenevan’s mother says he died after taking part in an online choking challenge
— (Provided)

When asked for comment, TikTok cited its community guidelines, which state it does not allow showing, promoting, or sharing plans for suicide or self-harm, and said any content violating these terms will be removed. The social media platform also says it releases quarterly community guidelines enforcement reports to hold itself accountable, and claims 98.1% of the content removed in the second quarter of 2023 was removed proactively.

TikTok added that it works closely with Samaritans and the International Association for Suicide Prevention to remove harmful content whilst allowing users to share their issues in a safe way.

But the Walshes are far from the only family to suffer such a painful loss. Isaac Kenevan was a typical teenage boy who loved gaming and sports. Last March, a coroner ruled that he died by ‘misadventure’ after his mother, Lisa, found him lifeless on the bathroom floor.

The mother from Essex believes his death was due to an online ‘choking’ challenge, and says the police found two videos of him taking part in it on his phone.

“My son was a typical teen, he was a happy boy with friends,” the 51-year-old said. “He was just very inquisitive; he enjoyed doing all the online challenges, such as the ice bucket and bottle flip challenges. We never imagined it would do him any harm.

“He was such a joyous child, he didn’t have a harmful thing in his body. I can’t watch another family go through what we have.”

Olly Stephens was killed after he was targeted online, a court heard
— (Provided)

Mrs Kenevan is campaigning for parents and coroners to have the right to request online data going back further than three months.

It is not just exposure to troubling material that is raising concerns - social media has also played a part in an increasing number of court cases.

Olly Stephens was a kind child from a sleepy Reading suburb who fell victim to gang culture when he was targeted online.

The 13-year-old became aware of a friend being ‘patterned’ - where a victim is humiliated on camera and then blackmailed with the footage. Olly tried to alert the victim’s older brother, but the perpetrators in the video accused Olly of snitching and started abusing and controlling him, a court heard when his young killers went on trial.

On the afternoon of Sunday 3 January 2021, the teenager shouted goodbye to his parents but never returned.

Olly and his father loved the Red Hot Chili Peppers and went to see them at Reading Festival
— (Provided)

A 14-year-old girl lured Olly to meet two boys, then aged 13 and 14, who killed him. The two boys were convicted of murder at Reading Crown Court in September 2021. The 14-year-old girl admitted manslaughter and was sentenced to three years and three months in a young offender institution.

His father Stuart Stephens, 54, heard the screams as his son was stabbed to death just metres away from their family home.

“I ran over to the field without my shoes. I knew he had gone,” he said. “I held his hand, I was begging him not to leave me. He was completely lifeless.”

He described a feeling of ‘impending doom’ as he pleaded with his son to tell him what was going on, watching the “cheeky” and “funny” boy become a shell of his former self in the lead-up to his death.

“We never understood the depth of the abuse he was receiving,” his father added. “Surely these apps have some kind of duty of care.”

Michelle Donelan, Secretary of State for Science, Innovation, and Technology, has said the government will ‘clean up the wild west of social media’
— (PA Wire)

A Snapchat spokesperson said: “This case is horrific and there is no place for this on Snapchat. Just one experience like this is one too many, but this is not reflective of the majority of the 21 million people who use Snapchat in the UK.

“We have clear rules about what’s allowed, and teams of moderators who review content reported to us or detected by our cutting-edge technology. Even though Snaps disappear, in many cases, evidence does not — we can preserve content and take action, including working with police to support investigations.

“Being a platform popular with young people comes with additional responsibilities. That’s why we have extra protections for under 18s, parental tools to help families know who their teens are talking to, and why we work with experts on all aspects of safety across the platform.”

Ofcom’s first aims are to remove minors from friend suggestion lists and to block messages from strangers; it also hopes to improve the automatic detection and removal of harmful content - including dangerous suicide and self-harm material.

Technology Secretary Michelle Donelan said the publication of the first codes marked a “crucial” step in making the Online Safety Act a reality by “cleaning up the wild west of social media and making the UK the safest place in the world to be online”.
