Salon
Politics
Amanda Marcotte

Shutting down the right-wing rabbit hole

"The man I loved wasn't there anymore — and instead this monster that had the most horrible thoughts about people was in its place."

When the woman I'll call Ann married her husband in 2002, he was "someone who couldn't care less about anything political at all." Over the years, he drifted into being a Republican, but it wasn't until 2017, after the election of Donald Trump, that she says "his radicalization and intro to conspiracy theory happened." Within the last few months, Ann told me, her husband began telling their children "how the people behind Monster Energy drink are obviously Satanists because they hid symbols on their can."

I met Ann on the Reddit forum QAnon Casualties, where family members and friends of QAnon believers and other far-right conspiracy theorists come to commiserate. Like several other members of the forum I contacted, she requested that Salon not publish her real name.

Ann hung in there for years, sticking with her husband through the far-right conspiracy theories and the flat-out weird ones. "He tried to convince me that the NFL was run exactly like the WWE in that it was entirely scripted," she told me. Eventually, with the help of a therapist, she came to the conclusion that "there is literally nothing I can say to bring him back."

That realization "has made things easier to handle," Ann says. Before that, "all I could think about was how badly I wanted to die," she continued. "I was scaring myself with how badly I just wanted out." With her therapist's aid, she says she is now planning an exit strategy from her marriage.

Ann's journey is one that untold numbers of people have endured in recent years: watching a loved one become radicalized through online disinformation. Once such people have disappeared down the proverbial "rabbit hole," it can sometimes be impossible to get them back. Preventing people from falling into the disinformation abyss in the first place is obviously crucial — and the good news is that prevention is possible. Experts already know a lot about both why and how people get radicalized, but the difficult part is interrupting the process by which vulnerable people are exposed to ever more vicious propaganda that lures them into the darkest caverns of social media. 

One of the most promising avenues for prevention has emerged from a surprising place: Parents and schools who have made it a mission to battle social media addiction. They're using the same tools that proved so effective at curtailing a different and even deadlier public health menace: cigarette smoking. Only this time around, instead of suing Philip Morris and other big tobacco companies, they're going after Meta, Twitter, TikTok, YouTube and Snapchat. 

Another QAnon Casualties poster who goes by Tristan Penifel (also not his real name) told me that Ann's story is highly typical of visitors to the forum. People show up initially, he said, wondering how they can debunk the wild stories and outrageous claims they're hearing from a formerly normal loved one. Tristan says he and the other forum regulars gently try to steer the newbies away from that path. The old hands know that it's nearly impossible to argue people out of false beliefs they've picked up online, once those have become ingrained in their identity. 

Tristan told me his father fell deep into the realm of right-wing conspiracy theory years ago, and became obsessed with the claim that Barack Obama had been born in Kenya. "I spent hours one evening debunking every single thing that he could find about the birth certificate being fake," Tristan explained in a Zoom interview. "By the end of all of that, he was like, OK, maybe the birth certificate is real."


But Tristan's victory was short-lived. His father immediately pivoted to another conspiracy theory that claimed Obama "was handpicked by the banks to protect them after the 2008 financial crisis." Tristan's father had once been involved in Occupy Wall Street, but the allure of online conspiracy theories had pulled him far to the right.

Tristan went on to write about his experiences at Medium, arguing that "all of this is ultimately on the conspiracy theorist. They are the one who has some kind of emotional sickness driving them into these beliefs, and they are the one who can cure themselves. If they refuse to engage, no one else can save them."

None of this comes as a surprise to experts who research far-right groups and conspiracy theorists. What the members of QAnon Casualties see happening to their loved ones is much bigger and more systematic than a small subset of the population adopting some kooky notions. Instead, they have become radicalized, and recent history tells us that while most people who believe outlandish conspiracy theories will not commit violent acts, the danger is very real. 

Even when they don't commit violence, "people who get involved in these movements destroy their lives," David Neiwert, author of "The Age of Insurrection: The Radical Right's Assault on American Democracy," told me in a recent interview. "It draws people into the abyss. It ruins their family relationships, ruins their relationships in the community."

Early in the Trump administration, activists like Christian Picciolini, who has worked to help deradicalize white nationalists, attracted a flurry of media attention. Most experts believe, however, that coaxing back someone who has dug themselves deep down the rabbit hole is a difficult and unpredictable process.

"Deradicalization is even more personal and idiosyncratic than radicalization," Brian Hughes, an American University professor who co-founded the Polarization and Extremism Research and Innovation Lab (PERIL), told me. "Deradicalization is something that you do after the worst has already happened," he continued, meaning "after a person has really made a grave social and even moral mistake, or sometimes committed an act of violence."

Even at that point, it frequently doesn't stick. Even as defendants from the Jan. 6 Capitol attack face prison sentences, there's a common thread to many of their stories: They're totally not sorry. Jacob Chansley, the infamous "QAnon shaman," got some relatively sympathetic media coverage after his arrest, when his defense attorney portrayed him as a regretful dupe. Now that he's out of jail, however, Chansley has made clear that he's still a member of the QAnon faithful.


Chansley told the Arizona Mirror he was "not a big fan" of his now-former lawyer, "after I found out all the things he was saying in the media without my consent. He said that I felt duped by Trump. I never said that. I never asked him to say that. He said that I denounced Q and the QAnon community. I never said that."

Lisa Sugiura of the University of Portsmouth encountered a similar phenomenon, she said, while researching her book "The Incel Rebellion: The Rise of the Manosphere and the Virtual War Against Women." While some men she interviewed identified as former rather than present-tense incels, Sugiura reports that many still share the same misogynistic views found on the "involuntary celibate" forums they have supposedly left behind.

"They will still say, 'Oh, well, you know, women still get away with a 'pussy pass,'" or that "women shouldn't be so picky," she said. "It's very depressing. You think, well, is there a way out?"

"There is no getting him back. He is very much dead to me," said George Fincher, who lives in Birmingham, England, and agreed to use his real name. He was talking about his stepfather, a former Royal Marine who has gone deep down the online QAnon rabbit hole. Fincher said he suspects that his stepfather is battling unresolved trauma, and in the process has alienated his family.

"He doesn't speak to his kids. He's met his grandchild once," Fincher explained. "Probably, in his mind, the only thing that he has going for him is this [QAnon] idolatry. Because he certainly doesn't have a family anymore."

This comports with what researchers know about people who become radicalized. PERIL uses a "supply and demand" model to explain the process. Hughes laid out this theory in a legal brief filed in a lawsuit after Steven Carrillo, an Air Force sergeant affiliated with the extremely online Boogaloo movement, shot three law enforcement officers in California, killing one of them. 

"On one hand, extremist recruiters and propagandists offer a supply of ideological material, imagery, entertainment, and opportunities to organize with like-minded extremists," Hughes wrote. "On the other hand, individuals pursue radicalization because it meets certain social and psychological needs — this is the demand side of radicalization." 

As Hughes told Salon in an interview, "In the days before the internet, a person would have to be very lucky — or very unlucky, depending on how you want to look at it — to encounter extremist propaganda or an extremist recruiter." Neo-Nazis used to trawl hardcore punk clubs looking for vulnerable kids; white nationalists would target prospects at gun shows. "What's changed nowadays with the internet," he said, "is that you can't avoid radicalizing material. Propaganda is everywhere."

Evidence that the internet has accelerated radicalization is also everywhere, as anyone who has watched Facebook friends melt down in real time can attest. Researchers at the Public Religion Research Institute (PRRI) found in 2022 that "nearly one in five Americans (16%) are QAnon believers." Among Republicans, that proportion rose to 25%.

Beyond the damage done to individuals, families and the body politic, there is compelling evidence that online radicalization has fueled a rapid rise in extremist violence. The Government Accountability Office reported a nearly fourfold rise in domestic terrorism cases from 2013 to 2021. In many of the most dramatic examples, a common factor is online radicalization. Consider "incel killer" Elliot Rodger in California, white supremacist mass murderer Dylann Roof in South Carolina and the mass shootings in a Pittsburgh synagogue, an El Paso Walmart and a Buffalo supermarket. These dreadful cases are alarmingly similar: A young man radicalized by online propaganda decided to act on his bigoted delusions with real-world violence.

A Facebook whistleblower opens the door

In the fall of 2021, former Facebook employee Frances Haugen began to release reams of internal company documents exposing all manner of embarrassing secrets: Facebook had knowingly let disinformation flourish on its platform, had turned a blind eye to hate speech and overt incitements to violence, and deliberately targeted underage users, despite internal research showing that social media overuse could be dangerous to minors. 

"There were conflicts of interest between what was good for the public and what was good for Facebook," Haugen told "60 Minutes." "Facebook, over and over again, chose to optimize for its own interests, like making more money."


The Facebook document dump was a big scandal, at least at first. But media attention faded rapidly, especially as Elon Musk flirted with buying Twitter and then finally did so, shifting the locus of concern over social media disinformation away from Facebook and toward the "bird site." But Haugen's whistleblowing clearly had an impact on government and the legal system in terms of one crucial issue: The mental health impacts of social media algorithms on teenagers and children.

Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee on Oct. 5, 2021, in Washington. (Matt McClain/The Washington Post via Getty Images)

Over the past year and a half, there have been multiple congressional hearings about the health risks of social media for underage users. A Centers for Disease Control report on the teen mental health crisis included concerns about the impacts of social media. In May, Surgeon General Dr. Vivek Murthy released an advisory noting "growing evidence that social media use is associated with harm to young people's mental health." While admitting that the phenomenon is complex and that social media access can be beneficial for some kids in some circumstances, he argued that unregulated, excessive use was a likely contributing factor to the "national youth mental health crisis."

What mental health risks are we talking about here? This remains a fraught topic, where research is continuing. What's discussed most often are issues like low self-esteem, suicidal ideation, eating disorders, insomnia and attention disorders. But there is also evidence that the potentially addictive qualities of social media — although the term "addictive" remains controversial — contribute to right-wing radicalization. So one way of framing the problem, and a potential solution, is to argue that if social media companies can be forced to curtail the business strategies that lead users down right-wing rabbit holes and negatively affect mental health, it would be much more difficult for far-right movements to recruit online. 

Legislative approaches to regulating social media are still in their infancy, and are likely to encounter strong resistance. But there's one promising channel that could move a lot faster: Litigation. A growing number of lawsuits filed by school districts, parents and young people themselves are claiming that social media companies deliberately design their products to be addictive — or at least to draw in users for longer periods of time and maximize "engagement" — which is contributing to the youth mental health crisis. 

"The end goal is to make young people engage with and stay on the platforms as long as possible, because that means they can sell more advertising," lawyers for the Cabrillo Unified School District in Northern California argue in a district court lawsuit filed in May. "The YouTube, TikTok, Snap and Meta companies have learned that this is best accomplished by catering an endless flow of the lowest common denominator of content that is most provocative and toxic that they can get away with."

Anne Murphy, one of the plaintiff's lead attorneys, told Salon: "There's been good research done that shows that social media's effect on the brain is very similar to the effects that you have with gambling or even taking recreational drugs."

"What we are looking at is the 'public nuisance' legal theory, which allows government entities to hold companies liable for unique damages caused as a result of a company's conduct," said Ron Repak, a lawyer who is representing Pittsburgh-area schools in a similar lawsuit.

The lawyers who spoke to Salon pointed to precedent-setting lawsuits of years past, based on claims that companies had deliberately addicted their customers to harmful products in search of greater profitability. The most famous of these were the consumer protection cases against tobacco companies that began in the 1990s, which were so successful the federal government joined in. In 1999, the Department of Justice filed a racketeering suit against Philip Morris and other tobacco companies, ultimately winning a ruling that those huge corporations had systematically defrauded the public by lying about the health risks of smoking. Now the school districts suing social media companies are making similar claims, also from a position of governmental authority. Similar lawsuits are being filed by parents who allege direct damages to their children.


The lawyers who spoke to Salon emphasized that their goal is not strictly financial, although their claims are partly based on the taxpayer money lost because schools must spend more on mental health and security. They also hope that a measure of real accountability can force social media companies to rethink how they do business. Ira Weiss, who represents a different Pittsburgh-area school district, said he believes that these kinds of lawsuits — against the tobacco giants, Big Pharma and vape manufacturers, for instance — can also help educate the public and political leaders. 

During the discovery process, Weiss explained, internal company documents will likely be obtained, and "there will be depositions taken where all this marketing and technology strategy will become known." (One recent model that made headlines was the defamation suit against Fox News by Dominion Voting Systems, which exposed a great deal of embarrassing material about the right-wing cable network's internal operations.)

Theories about exactly how social media addicts people or lures them in deeper, especially younger people, are still being developed. It's fair to say the mechanism is not exactly understood, as was also true of nicotine addiction for many years. But there can be little doubt that social media companies program their algorithms to make scrolling seem almost irresistible. As Haugen put it, Facebook's internal research made clear that its products were "designed to be engaging," and also that "that can lead to very high rates of what we call 'problematic use.'" As an article for NBC's "Today" site explains it, "That's how an innocent search for 'healthy recipes' on Instagram might lead a teenager to eating disorder content instead." 

That's what is meant by the ubiquitous term "rabbit hole": Users may begin with relatively innocuous content, but to keep them "engaged," social media algorithms keep serving up ever more extreme — and, yes, more engaging — content that can plant dark thoughts and provoke deep insecurities. One lawsuit filed by California parents cites Pew research showing that, for some users, "the more they use social media, the more they are drawn down the rabbit hole into further use, making it increasingly difficult for them to stop."

"Addicted" to QAnon, misogyny and the Boogaloo

The same rabbit-hole phenomenon that can draw social media users deeper into the world of eating disorders or suicidal ideation also appears to be a factor in online radicalization. Lisa Sugiura notes that many of the men she interviewed while researching the "incel" community were first drawn into that world through unrelated or apolitical online material, before the algorithm turned their heads toward darker stuff. One interviewee, she said, had done a "simple Google search" about male pattern baldness and eventually ended up on "incel forums, which were heavily dissecting and debating whether being bald is an incel trait."

That man became an incel "very much through the algorithm," Sugiura said, and through online conversations with people who "showed him a different way to view the world."

"Pathologies like eating disorders and suicidality exist on a continuum with radicalization," said Brian Hughes, the American University scholar. "In a lot of cases, they're co-morbid. Depression and radicalization are commonly seen together." Just as online merchants hawking dangerous diet products exploit young women's insecurities, he added, the world of far-right influencers displays "an obsession with an idealized masculine physique, which often leads to steroid abuse."

The most famous example of that phenomenon is Andrew Tate, a British influencer currently being held by Romanian authorities on charges of rape and human trafficking. Tate's alleged victims say he choked them until they passed out, beat them with a belt and threatened them with a gun. A former kickboxer, Tate has made a fortune by showing off his muscular physique and expensive toys, gizmos and gear to attract a massive online following of young men, promising that he can turn them into "alpha males." Tate has become so popular with boys and young men in the English-speaking world that educators are organizing and sharing resources in an effort to combat his influence. 

"There's been a huge increase in rape jokes that the boys are making," a seventh-grade teacher in Hawaii told Education Week

"Pathologies like eating disorders and suicidality exist on a continuum with radicalization," said Brian Hughes of American University. "Depression and radicalization are commonly seen together."

Conspiracy theories and right-wing propaganda often hook people, as Tate does, by appealing to anxiety and insecurity, especially regarding hot-button issues like race, gender and status. In his legal brief in the case of Steven Carrillo, Hughes explained that the murderer "was gratified by the feelings of anger and indignation" from far-right videos he saw on Facebook and "was rewarded with more extreme, more angering content." (Carrillo pleaded guilty to murder and eight other felony charges last year, and is serving a life sentence without parole.)

"Facebook algorithms would encourage Carrillo to join a Facebook group called '/K/alifornia Kommando,'" Hughes wrote. Once there, "his deterioration increased at a terrific speed. He fully embraced the new identity of Boogaloo revolutionary."

Jason Van Tatenhove understands how that process works. A former member of the Oath Keepers, he offered dramatic testimony before the House Jan. 6 committee last year, explaining how leaders convinced their followers to join the insurrection on Trump's behalf. In his book "The Perils of Extremism: How I Left the Oath Keepers and Why We Should Be Concerned about a Future Civil War," Van Tatenhove details how he first got sucked into the group, and what it took for him to get out.

Jason Van Tatenhove, who served as national spokesman for the Oath Keepers, testifies on July 12, 2022, in Washington. (Kevin Dietsch/Getty Images)

"There's kind of a formula to what we were doing," said Van Tatenhove, who was hired to do communications work by Oath Keepers leader Stewart Rhodes, who was recently convicted of seditious conspiracy and various other charges, and sentenced to 18 years in prison. "We were always watching the news aggregates. We would set up Google alerts on certain keywords," in order to tailor recruitment content to what potential prospects were seeking out, especially on social media.

"What were the issues that really got people outraged and angry? Because that's the low hanging fruit," Van Tatenhove added. "We were looking for that outrage and that anger, because it seems to short-circuit our critical thinking centers."

That's exactly the kind of content that Facebook's algorithms have favored, according to Frances Haugen. "When you give more distribution to content that can get reactions," she said in a recent MSNBC interview, "you end up rewarding more extreme content. Because the shortest path to 'like' is anger." 

Van Tatenhove says he is personally in recovery from drug addiction and, during our conversation, compared the allure of conspiracy theories to that of heroin. "While heroin feels great, it ruins your life," he said. "While conspiracy theories feel great and we get all those chemical releases — much like shooting heroin — it's damaging to our country and it's damaging to our democracy." He suggested that the U.S. needs an analogue to "methadone" for our conspiracy-theory addiction.

What is to be done?

There's clearly no magic-bullet solution to this problem, but the growing number of lawsuits by parents and schools could be the beginning of one. None of these plaintiffs seek a cold-turkey approach to social media, which would be neither possible nor desirable. As Murthy told the "Offline" podcast, it's clear that many people "find community" on social media they might not find elsewhere, and that was doubly true during the pandemic. Social media, he added, can be especially beneficial for those "from a historically marginalized group where it's difficult to find people who may be going through similar experiences," such as LGBTQ youth.

"Conspiracy theories feel great and we get all those chemical releases, much like shooting heroin," said Jason Van Tatenhove. But "it's damaging to our country and it's damaging to our democracy."

As the complaint in the California parents lawsuit argues, the issue is not the existence of social media but the companies' reliance on an "algorithmically-generated, endless feed to keep users scrolling in an induced 'flow state'; 'intermittent variable rewards' that manipulate dopamine delivery to intensify use; 'trophies' to reward extreme usage" and other features that keep people overstimulated and unable or unwilling to log off.  

The desired outcome here is simple: financial pressure from these lawsuits may induce social media companies to make their products less addictive. Haugen has repeatedly argued that the path forward will require some form of government regulation. But the two approaches aren't separate; they're intertwined, the lawyers who spoke to Salon emphasized. They believe their lawsuits can generate public attention and information that will help shape both future regulation and the public will to enact it.

The most promising legislative effort is a bipartisan bill introduced by Sen. Brian Schatz, D-Hawaii, and co-sponsored by another liberal Democrat, Chris Murphy of Connecticut, and two conservative Republicans, Katie Britt of Alabama and Tom Cotton of Arkansas. The Protecting Kids on Social Media Act, like a similar state law in Utah, would bar kids under 13 from starting social media accounts and require parental consent for older teenagers. That is almost certainly unenforceable but, more intriguingly, the bill also takes aim at online algorithms, barring companies from using them to drive content to minors, although it would remain legal to use them on adults.

"While kids are suffering, social media companies are profiting," Schatz said in a statement. "Our bill will help us stop the growing social media health crisis among kids by setting a minimum age and preventing companies from using algorithms to automatically feed them addictive content based on their personal information." Of course, as the families of QAnon believers and the victims of right-wing extremist violence can attest, it's not just kids who are going down dangerous online rabbit holes. Schatz conceded as much in a Wired interview, saying that "it's bad enough that it's happening to all of us adults," but that "the least we can do is protect our kids."


The fate of that bill is uncertain, and there's considerable resistance in both parties to pursuing direct federal regulation of social media, which would surely be cast by the social media industry and some civil liberties advocates as an attack on First Amendment rights. Schatz has argued, however, that there is "no free speech right to be jammed with an algorithm." Restricting the use of algorithms wouldn't impact anyone's constitutional right to express their opinions on social media. It would simply prevent the companies from shoving the most incendiary content at the most vulnerable users.

These questions of free speech and social media algorithms are tricky, and the legal and legislative arguments are in their early stages. When the Supreme Court was confronted earlier this year with a case arguing that social media companies should be liable for algorithmic promotion of pro-terrorist content, the justices seemed grateful to punt the entire issue.

Legal experts hope that the consumer protection argument driving this new wave of social media lawsuits will create a framework for addressing this issue that sidesteps First Amendment concerns. The recent Supreme Court case looked at "whether a company can be held accountable for specific content," attorney Anne Murphy said. "We're looking at it much more holistically," by asking whether the algorithm should exist in its current form at all, given the demonstrable harm it does to children.

The core idea here is to shift the focus from individual speech, which everyone agrees is protected by the Constitution, to the larger question of regulating the social media business model to mitigate its effects on public health. If the legal groundwork can be laid in these cases on the principle that children and teenagers should be protected from predatory business practices and deliberately harmful products, that could open the door to a larger regulatory structure that makes the internet safer for everyone. 

This entire discussion is still in its infancy, and more needs to be learned about the way social media affects mental health. As mentioned above, there's considerable debate over whether the "addiction" model is a fair or accurate way to describe what happens when people get sucked deep into online rabbit-hole communities that encourage destructive behavior or ideologies. The lawyers who spoke to Salon almost universally expressed hope that this litigation can help illuminate some of these issues and drive more resources toward the necessary research.

Still, the family members of right-wing conspiracy believers have little doubt that social media had a profoundly debilitating effect on their loved ones, in many cases people who were vulnerable to the algorithm-driven pressure to stay online. Some people who dive down the rabbit hole may be struggling with long-term trauma or undiagnosed mental illness. Many are extremely lonely. In some cases, they have pre-existing prejudices and are eager to have their bigoted views validated by disinformation. Whatever the root causes may be, social media platforms are profiting handsomely off people who are slowly losing their minds to the reality-distortion field of their feeds. No one suggests these lawsuits can solve the problem by themselves, but they could mark the beginning of a major public reckoning with the harms of social media that leads, eventually, to real answers. 
