The Guardian - UK
Technology
Caroline Kimeu in Nairobi

‘The work damaged me’: ex-Facebook moderators describe effect of horrific content

Former workers have described how some of the images they saw while working for Samasource have remained etched in their minds. Photograph: Tony Karumba/AFP/Getty Images

When James Irungu took on a new job for the tech outsourcing company Samasource, his manager provided scant details before his training began. But the role was highly sought after and would nearly double his pay to £250 a month. Plus it offered a path out of Kibera, the vast shantytown on the outskirts of Nairobi where he lived with his young family.

“I thought I was one of the lucky ones,” the 26-year-old said. But then he found himself ploughing through heaps of violent and sexually explicit material, including grisly accidents, suicides, beheadings and child abuse.

“I remember one day when I logged in to see a child with their stomach torn wide open, suffering but not dead,” the Kenyan national told the Guardian. It was seeing child exploitation material “when it really kicked in that this was something different”.

He had been hired by Samasource to moderate Facebook content, weeding out the most toxic posts. Some of the most tormenting images remained etched in his mind, occasionally jolting him awake in night sweats. Fearing that opening up about his work would evoke discomfort, concern or judgment from others, he kept it to himself.

Exasperated by his “secretiveness”, his wife grew distant. Irungu resigned himself to them drifting apart, convinced he was protecting her, and stayed in the job for three years. He says he regrets pressing on.

“I don’t think that work is suitable for human beings,” he said. “It really isolated me from the real world because I started to see it as such a dark place.” He became afraid to let his daughter out of his sight.

“When I ask myself if the money was really worth sacrificing my mental health for, the answer is no.”

Another former moderator said she was alarmed by some of the content, and some fellow workers dropped out. But she found purpose in assurances from her managers that their work protected users, including young children like hers.

“I felt like I was helping people,” she said. But when she stopped, she realised that things she had normalised were troubling.

She remembered once screaming in the middle of the office floor after watching one horrific scene. Except for a few glances from co-workers, and a team leader pulling her aside to “go to wellness” counselling, it was like nothing had happened, she said. The wellness counsellors told her to take some time to rest and get the image out of her head.

“How do you forget when you’re back on the floor after a 15-minute break, to move to the next thing?” she said. She wondered if the counsellors were qualified psychotherapists, saying that they would never escalate a case for mental healthcare no matter what the moderators had seen or how distressed they were.

She went from being the kind of person who would host friends at every occasion to barely leaving her house, and from crying at the deaths of people she did not know to feeling numb and mentally troubled, sometimes battling suicidal thoughts.

“The work damaged me, I could never go back to it,” said the woman, who hopes that the case will have an impact on the content moderation industry in Africa as global demand for such services grows.

“Things have to change,” she said. “I would never want anyone to go through what we did.”