Ahead of the 2024 election cycle, the world's largest tech companies are walking back policies meant to curb misinformation around COVID-19 and the 2020 election.
Why it matters: Social media platforms argue that the risk of harm no longer outweighs the benefits of political dialogue, drawing concern from lawmakers and consumer advocacy leaders.
Driving the news: YouTube last week confirmed that it will reverse its election integrity policy to leave up content that says fraud, errors or glitches occurred in the 2020 presidential election.
- YouTube established the policy in December 2020, after enough states had certified the 2020 election results.
- The company said in a statement that leaving the policy in place may have the effect of "curtailing political speech without meaningfully reducing the risk of violence or other real-world harm."
Meta on Monday reinstated the Instagram account of Robert F. Kennedy Jr., who was removed from the platform in 2021 for posting misinformation about COVID.
- That content, Meta said at the time, violated its COVID misinformation rules. Meta also removed the Instagram and Facebook accounts of Kennedy's nonprofit, Children's Health Defense. Those accounts remain banned.
- Meta's spokesperson said that it reinstated the account because "he is now an active candidate for president of the United States."
- Kennedy Jr., the nephew of the late President John F. Kennedy and a prominent anti-vaccination voice, announced his bid to take on President Biden in the 2024 Democratic primary. Last week, Kennedy Jr. tweeted a complaint that he was unable to register new Instagram accounts.
Be smart: In the U.S., speech from political candidates has long been regulated differently on public networks.
- Broadcast channels, for example, can't censor or deny political ads for including falsehoods.
- Meta has long supported that idea and regulates speech from political figures and world leaders differently than it does speech from everyday users. YouTube appears to be headed in a similar direction.
What they're saying: "The problem isn’t that election misinformation exists, but rather that social media as a product amplifies misinformation-at-scale, which makes it everyone’s problem now," said Joan Donovan, a misinformation expert and author of the book "Meme Wars."
Kathleen Hall Jamieson, director of the Annenberg Public Policy Center and founder of Factcheck.org, argued that with a few exceptions — including health threats and real-time incitement of violence — fact-checking is a stronger antidote to misinformation than blocking speech.
- The best solution, she argues, is to "flood the zone with the best available information, make sure that when the misinformation gets up there, you've got corrective context with good information up next to it."
Yes, but: Layoffs and cost-cutting measures at Big Tech firms have thinned trust and safety teams at most major platforms heading into the 2024 election.
Flashback: Amid the 2020 election, the Black Lives Matter protests and the pandemic, tech platforms instituted stricter misinformation policies to curb threats of real-world harm or violence.
- Several platforms, such as YouTube, Twitter and Reddit, banned prominent white supremacist accounts, like those of David Duke and Richard Spencer. (Twitter has since reinstated many of those accounts under Elon Musk.)
- After the 2020 Hunter Biden laptop saga, when social media sites blocked a New York Post story from spreading due to fears that it might be fraudulent or contain illegally obtained information, platforms began to tread more cautiously around restricting content.
The big picture: Since Musk bought Twitter and promised "free speech"-first policies, the service has become an attractive platform for politicians who feel disenfranchised by Big Tech.
- The company said last November that it would no longer enforce its COVID misinformation policy, restored the accounts of prominent election deniers in January and rolled back its policies against misgendering people in April.
- In a tweet thread last week, Kennedy Jr. lauded Twitter for allowing "my campaign and me to have a voice" while Meta blocked his accounts. Shortly after, Kennedy Jr. confirmed he would participate in a Twitter Spaces audio event with Musk on Monday.
- Musk had said that he would offer any presidential hopeful the chance to join an event like the one he hosted for Florida Gov. Ron DeSantis.
What to watch: Tech firms are also pulling back some of the restrictions they've traditionally placed on political ads.
- Twitter said earlier this year it plans to resume taking some political ads, after mostly blocking them beginning in 2019.
- Spotify brought back most political ads last year after a two-year ban.
- Disney said last year that Hulu will accept political issue ads — in addition to candidate ads — bringing Hulu's ad policies in line with those of Disney's cable networks.