In late 2019, during one of Mark Zuckerberg’s many trips to Washington to defend Facebook in front of Congress, he stopped for a private dinner with Donald Trump and offered the president a flattering statistic. “I’d like to congratulate you,” Zuckerberg said. “You’re No. 1 on Facebook.”
At least that’s the story as told by Trump, on Rush Limbaugh’s radio show in January. Trump is technically not the top politician by followers on Facebook. That would be former President Barack Obama. But as the country’s most powerful newsmaker and the person in charge of a government that’s been aggressively pursuing antitrust cases against big tech companies, he does have leverage over Zuckerberg. So the chief executive officer could be forgiven for flattering Trump. Any moment that the president is happy with Facebook is a moment he’s not pursuing hostile regulation—or more likely, sparking a bad news cycle.
Facebook Inc. declined to comment on whether Zuckerberg indeed told Trump he was No. 1 and, if so, in what category he meant, but it’s adamant that its founder isn’t playing favorites. After the New York Times speculated that the dinner between Zuckerberg and Trump might have involved a deal over whether Facebook would fact-check the president, Zuckerberg said he was simply stopping by the White House because he was in town. “The whole idea of a deal is pretty ridiculous,” he told Axios in July.
Longtime current and former employees say this denial may be a bit misleading. Zuckerberg isn’t easily influenced by politics. But what he does care about—more than anything else perhaps—is Facebook’s ubiquity and its potential for growth. The result, critics say, has been an alliance of convenience between the world’s largest social network and the White House, in which Facebook looks the other way while Trump spreads misinformation about voting that could delegitimize the winner or even swing the election. “Facebook, more so than other platforms, has gone out of its way to not ruffle feathers in the current administration,” says Jesse Lehrich, co-founder of Accountable Tech, an organization making recommendations to tech companies on public-policy issues. “At best, you could say it’s willful negligence.”
The pattern hasn’t been confined to U.S. politics. A Facebook executive in India was accused in August of granting special treatment to a lawmaker from Prime Minister Narendra Modi’s ruling Bharatiya Janata Party who’d called for violence against Rohingya Muslim immigrants. (It was only after the Wall Street Journal reported on the posts that the company banned the lawmaker, T. Raja Singh.) A memo from a former employee, published by BuzzFeed on Sept. 14, detailed how Facebook had ignored or delayed taking action against governments using fake accounts to mislead their citizens. “I have blood on my hands,” she wrote.
“You can continuously see the challenge of them trying to have these kinds of broad principles around free expression and stopping harm, and then that mixing with the realpolitik of trying to keep the executive branch happy,” said Alex Stamos, the former top security executive at Facebook, at a conference in June. Facebook executives say their only loyalty is to free speech. “The idea that there is systematic or deliberate political bias in our decisions I really don’t think is borne out by the facts,” says Nick Clegg, head of policy and communications. “Of course, there are isolated cases.”
Facebook executives often point out that the company was seen as overly friendly to Democrats during the Obama years and that it takes plenty of heat from the Right. In the summer of 2016, the tech website Gizmodo reported that Facebook had been directing employees to suppress pro-Trump sites in its trending news section. The story led to a scandal over supposed anticonservative bias at social media companies, and it was in response to this backlash that Facebook started to drift rightward. The company flew conservative commentators to its headquarters in Menlo Park, Calif., to reassure them that there was no need for concern about how Facebook operated.
The romancing continued after Election Day as Facebook celebrated Trump’s victory. In January 2017, the company co-hosted an inauguration party with the Daily Caller, which has published writers who have espoused white nationalist views. The lemons in the catered drinks were stamped with Facebook logos. An internal report from around the same time touted Trump’s superior strategy with Facebook ads, noting that Trump followed advice and training from the company that his opponent, Hillary Clinton, had rejected. Trump “got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser. Period,” Andrew Bosworth, who currently runs the company’s efforts in augmented and virtual reality and who was head of ads at the time, wrote in a memo to employees in 2018.
Facebook’s mostly liberal staff saw its Republican relationship-building as the price of doing business, but as the company weathered public scrutiny—about Russia’s spread of election misinformation, as well as its failure to stop Cambridge Analytica’s data-gathering operation—something changed. Among the rank and file, and even among some executives, grudging professional admiration for Trump gave way to a realization that he was using Facebook to attack many of the issues employees cared about. During the Supreme Court confirmation hearings for Brett Kavanaugh, who had been accused of sexual assault, head of policy Joel Kaplan was shown seated directly behind the nominee, a close friend. The days that followed, several Facebook employees say, were the first time they saw colleagues cry openly at Zuckerberg’s weekly question-and-answer sessions.
After the Kavanaugh hearings, employees began to notice that Kaplan, George W. Bush’s former deputy chief of staff, seemed more concerned about critiques of bias from conservatives than from liberals. Facebook’s internal data showed that conservative voices were consistently the most popular on the site. (On a recent Monday morning, the top 10 Facebook posts by interactions—likes, shares, and comments—included eight from conservative pundits and news outlets, one from Ivanka Trump, and one from NPR.) But in 2019, under pressure from the same conservative commentators who’d visited Menlo Park before Trump’s election, Kaplan commissioned a yearlong independent study that concluded the opposite. “There is still significant work to be done to satisfy the concerns we heard from conservatives,” it said.
Historically, Facebook had given executives in charge of its products immense leeway in making decisions, but suddenly the company’s policy team seemed to have veto power. In January 2018, Zuckerberg asked for changes that would reduce the prevalence of news in users’ feeds, especially news from incendiary and untrustworthy outlets. The product team tweaked the news feed accordingly, but then members of Kaplan’s team reviewed test simulations. They noted that the change was causing traffic to drop more severely for right-wing outlets such as Fox News and Breitbart News, according to a person familiar with the incident who spoke with Bloomberg Businessweek on the condition of anonymity. Of course, that was because Fox and Breitbart tend to publish more incendiary content—Breitbart famously once had a “black crime” section on its site. So the engineers were ordered to keep tweaking the algorithm until it punished liberal outlets as much as conservative ones before the update was released to 2.5 billion users. Fox maintained its position as the top publisher on Facebook. This kind of review wasn’t unusual, employees say: the policy team would regularly investigate whether changes would hurt right-leaning outlets while seeming less concerned about the effects on left-leaning ones. A Facebook spokesman denies that Kaplan’s pushback was partisan and says the algorithm was not changed as a result of his concerns.
As employees started to worry about Facebook’s proximity to the Right, Facebook’s M-Team—“M” for management—seemed intent on pushing the company even closer to it. At one point, the group flew to New York for a leadership off-site at the headquarters of News Corp., which like Fox News is controlled by Rupert Murdoch and his family. One executive, Instagram co-founder Kevin Systrom (who left the company in 2018), refused to attend, citing Fox’s polarizing influence, according to a person familiar with the matter. The company says it regularly meets with media outlets. Systrom didn’t respond to a request for comment.
Eventually, Trump pushed the limits. In the early morning hours of May 29, he posted a message to his 29.5 million Facebook followers, warning protesters in Minneapolis that they were risking violent retribution. “When the looting starts, the shooting starts,” the president wrote. The formulation has long been associated with violent police crackdowns: Miami’s police chief used it in 1967, and the segregationist presidential candidate George Wallace later invoked a similar threat.
Trump had said the same on Twitter, which quickly hid his post, saying it violated rules against glorifying violence. Zuckerberg waited, leaving the post up for hours while consulting his top lieutenants about what to do. Chief Operating Officer Sheryl Sandberg, Kaplan, and Clegg all weighed in. So did Maxine Williams, the company’s head of diversity. And then, that afternoon, another very influential figure piped up: the president himself, who spoke to Zuckerberg by phone. Zuckerberg said later that he told Trump he disagreed with the post and found it unhelpful. But, crucially, he also concluded that it didn’t go against Facebook’s rules.
Trump’s post remained on Facebook, sparking a virtual walkout. Employees began criticizing Zuckerberg openly and leaking to the press. “Mark is wrong,” tweeted Ryan Freitas, director of product design for the company’s news feed, “and I will endeavor in the loudest possible way to change his mind.” A flurry of stories appeared over the next two months detailing instances that reinforced the suspicions about the alliance between Facebook and Trump. For instance, media outlets reported that the president had no negative hashtags associated with his name on Instagram, while Joe Biden had lots; that a Facebook employee was fired after complaining that the company seemed to be allowing far-right pundits, such as Diamond and Silk, to break rules about misinformation; and that an investigation into Ben Shapiro, whose site the Daily Wire routinely broke the rules to boost its audience, was thwarted by Kaplan’s policy group.
Employees also noticed a difference between Zuckerberg’s relationship with Trump and his interactions with his Democratic opponent. In a June 29 letter addressed to Clegg, Biden’s campaign manager, Jen O’Malley Dillon, pointed out three instances where she felt Trump had shared voting misinformation and asked if Facebook “will apply its policies impartially.” In a separate letter on July 10, campaign general counsel Dana Remus accused Facebook of hypocrisy. “Your company’s actions fail to live up to its stated commitments,” she wrote. Zuckerberg hasn’t spoken to Biden all year.
After the looting-and-shooting post, Facebook’s policy team contacted the White House to explain the company’s process, which led to Trump’s phone call with Zuckerberg. During the delay, the post got millions of views. Around the same time, Biden posted an open letter asking Facebook to stem the tide of misinformation. Facebook clapped back in public. “The people’s elected representatives should set the rules, and we will follow them,” it said in a blog post. “There is an election coming in November and we will protect political speech, even when we strongly disagree with it.” The message was clear: We listen to the government in charge.
By now, Facebook’s failures during the 2016 election are well known. A group backed by the Russian government used the company’s products to promote Trump and disparage Clinton, according to the report issued by special counsel Robert Mueller. For instance, Russian operatives created fake accounts aimed at Black voters, seen as a key part of Clinton’s base. They told people who followed these accounts they shouldn’t bother voting, or that they should vote by text message, which isn’t possible. In all, the Russian posts reached more than 150 million Americans.
The job of rooting out fake content created by foreign governments falls to Facebook’s election integrity and cybersecurity groups, which are separate from the policy team and, in theory, nonpartisan. Facebook has gotten better at finding these campaigns. Last year alone, it removed 50 networks of accounts like the Russian one from 2016. But some former employees have complained of being ignored or sidelined based on political concerns. In 2018, Yaël Eisenstat, a former CIA intelligence officer, worked on a plan to use software to scan ads for language that could give false information about voting procedures. The proposal was rejected, Eisenstat was told, because the problem wasn’t urgent enough. She left that November.
The following year, Facebook did make rules against giving incorrect information about how to vote, but then it froze when Trump actually put the policy to the test. On May 20, a week before the looting-and-shooting post, the president claimed that officials in Michigan and Nevada were sending out mail-in ballots illegally, which was not true. A few days later, on May 26, Trump posted that California was mailing ballots to “anyone living in the state,” another lie. The posts stayed up, and Zuckerberg went on Fox News to criticize Twitter, which had fact-checked similar posts. An outside civil rights auditor later concluded that Facebook failed to enforce its own policies in both instances.
Instead, Zuckerberg came up with something new—what he called “the largest voting information campaign in U.S. history,” a plan to register 4 million voters. Facebook also designed a “voting information center,” a webpage with facts about the election compiled from state authorities. The company has been promoting the page atop every user’s Facebook and Instagram feed and appending a link to it beneath every post on the service that mentions the election process. The hub “ensures that people can see the post and hear from their elected officials, warts and all, but also have accurate context about what the experts are saying,” Nathaniel Gleicher, Facebook’s head of cybersecurity policy, told reporters in August.
But the links below Trump’s ever-more-frequent posts about voting don’t warn Facebook users if the information is untrue—they simply advertise the information center. Moreover, after Republicans complained about the voter registration efforts, Facebook seemed to back off further, according to emails obtained by the Tech Transparency Project. The company had planned a two-day registration promotion over the July 4 holiday across Facebook, Instagram, and Messenger, but then cut it down to a one-day push on Facebook alone.
Facebook has said that the suggestion that the company scaled down its voter registration plans for political reasons is “pure fabrication.” Another spokesman, replying to a Twitter user who suggested the same, responded with a picture of a woman in a tin foil hat.
The company, of course, knows lots about conspiracy theorists, who thrive on the site. There’s QAnon, a far-right movement that espouses a complex theory involving a cabal of elites engaged in child sex trafficking. The FBI identified it as a potential domestic terrorism threat in August 2019, but Facebook didn’t begin removing QAnon accounts until May. The company also initially ignored posts tied to a Kenosha, Wis., militia in which users discussed shooting Black Lives Matter protesters. The militia’s event page was flagged more than 400 times, but moderators allowed it to stay up, according to BuzzFeed. Not long after the posts began appearing, a 17-year-old with an assault rifle shot and killed two people at a protest in the city.
Even as employees accuse Facebook of aiding Trump’s reelection effort, the administration has kept up the pressure on the company. In late May the president signed an executive order threatening to revoke the immunity enjoyed by social media companies, including Facebook, under Section 230 of the Communications Decency Act of 1996, if they showed political bias. The order was an apparent threat to social networks that censored posts from Trump and his allies. Facebook responded by saying the move would restrict free speech.
Trump’s threats haven’t yet materialized into anything crippling—at least not for Facebook. The U.S. Department of Justice is preparing a case against the company’s main rival, Google, which it’s expected to file before Election Day. Meanwhile, Trump has forced another key Facebook competitor, Chinese-owned TikTok, to find a U.S. buyer or face ejection from the country.
So far, Zuckerberg’s cultivation of Trump has seemed to keep Facebook safe from the president’s ire. But Trump is trailing by 7 points or so nationally, and it’s likely that a Biden administration would seek to regulate Facebook. In July, Zuckerberg got a preview of the Democrats’ playbook when he faced the House Judiciary subcommittee on antitrust, alongside all the other major tech executives. Representatives’ questions for him were pointed, prosecutorial, and informed by thousands of internal emails and chat logs that seemed to suggest a path for regulators to argue that the company should be broken up or otherwise penalized. “All of these companies engage in behavior which is deeply disturbing and requires Congress to take action,” David Cicilline, the Rhode Island representative who’s the panel’s chairman, told Bloomberg in August, the same day Facebook’s stock hit a record high. He said he was especially struck by the “casual way” Zuckerberg admitted to buying Instagram and WhatsApp to eliminate them as competitors. Facebook disputes Cicilline’s characterization and says neither acquisition has harmed competition.
Biden, meanwhile, has said he also favors removing Section 230 protections and holding executives personally liable. “I’ve never been a big Zuckerberg fan,” he told the New York Times in January. Zuckerberg seems keenly aware of the risks of a Trump loss. He’s told employees that Facebook is likely to fare better under Republicans, according to people familiar with the conversations.
This isn’t to say Facebook wouldn’t adapt if Biden wins in November. In June, Zuckerberg announced he’d rehired Chris Cox, the company’s former chief product officer, who’d been active in Democratic politics since leaving Facebook last year. Cox is widely considered the most likely candidate to become CEO if the Facebook founder ever steps down. “Nothing stays the same, of course not,” says Clegg, the VP for communications, when asked how Facebook would adjust to a future Biden administration. “We’ll adapt to the environment in which we’re operating.”