Last month, Mark Zuckerberg, CEO of Facebook – now renamed Meta – announced the promotion of Nick Clegg, the former UK deputy prime minister, to oversee all of the company’s policy and public relations matters globally.
What will Clegg do with his new position? If history is any guide, he’s likely to continue to be one of Facebook’s fiercest defenders, often using absurd, hypocritical arguments to uphold Meta’s worst actions.
In 2020, Clegg claimed Facebook merely “holds up a mirror to society”, while ignoring that the company designs its algorithm to reward the most extreme and polarising content. My disclosures to the US Congress and the Securities and Exchange Commission confirmed that political parties across Europe – on the right and on the left – found that Facebook’s 2018 algorithm changes forced them into more extreme political positions. In democratic societies, one could say Facebook votes before we do. And in war zones and fragile societies with weak law and order, Facebook can get people killed.
Then, in March 2021, Clegg subtly shifted his argument, saying that users and Facebook’s algorithms coexist in a symbiotic relationship – that, as he put it, “it takes two to tango”. By acknowledging the algorithm’s role in how Facebook operates, Clegg reopened a difficult question for the company: why are those algorithms hidden from the public?
My disclosures validated years of alarms raised by advocates – that the Facebook algorithm harms children, stokes division, and weakens our democracies.
And as we enter the “fog of war” with Russia’s invasion of Ukraine, we are seeing in real time how Russia is weaponising Facebook to spread its outrageous propaganda.
I’m an optimist by nature. I believe people such as Nick Clegg can choose to right these wrongs. And these five steps can make Facebook safer and more humane for its nearly 3 billion users worldwide.
1. Stop giving autocrats free rein to manipulate free societies
In 2018, Mark Zuckerberg promised the US Congress that Facebook would invest in identifying when Russia and other countries deployed influence networks to distort reality and amplify lies. Facebook has the technology to detect coordinated disinformation campaigns, but it radically understaffs the teams responsible for taking them down – giving bad actors free rein on its platforms. It is unacceptable that Facebook’s answer to the weaponisation of its platforms is to ask the targets of these campaigns – such as the people of Ukraine – to lock down their own accounts. The real problem is Facebook’s unwillingness to invest adequately in securing its own platform.
2. Stop entering fragile societies without due diligence
Facebook’s prioritisation of profit shows most clearly in its rapid expansion into societies around the globe, often without regard for the consequences for the people living in them. It does not scale its safety systems alongside this growth.
Consider Myanmar. Facebook admitted it had failed to stop horrific hate speech on its platform. The UN concluded Facebook played a “determining role” in fuelling genocide against the Rohingya ethnic group. Nonetheless, Facebook refuses to provide meaningful remedies to the Rohingya community for fuelling this violence.
Organisations such as Human Rights Watch rightly advocate that tech companies should conduct comprehensive assessments of how human rights may be harmed before they expand products into fragile parts of the world.
In western Europe and the US, we take for granted access to an open and independent internet. Facebook must stop strangling the open web at birth in these societies by bribing users to choose Facebook over open alternatives. Otherwise, when Facebook falls short, people will have nowhere else to turn.
3. Truth and transparency – today
I came forward because of a frightening truth: almost no one outside Facebook knows what happens inside Facebook. In 2020, Mark Zuckerberg claimed that Facebook’s algorithm took down “94% of hate speech”. But internal research from March 2021 showed the company catches a minuscule 3-5% of hate speech and only 0.6% of violence-inciting content.
Nick Clegg can now stop the lies. He must live up to his claims that Facebook will make itself more transparent. Clegg can take a simple step today: open up the algorithms so researchers can properly assess the platform. Then groups such as New York University’s Cybersecurity for Democracy research centre – which Facebook cut off from access to data while it was analysing the spread of misinformation – could actually see how the algorithm works.
4. Design for safety, not censorship
Facebook’s policy and PR playbook has centred public debate on a false choice between censorship and freedom of expression. Its PR team touts billion-dollar investments in content moderation but ignores internal studies showing that product design choices, such as limiting the number of reshares, would have nearly the same impact as third-party fact-checkers – without picking winners or losers in the marketplace of ideas.
5. Fix the real harms – in the real world
Last autumn, Facebook rebranded as Meta as part of its pivot to virtual reality and the “metaverse”. It insults the people harmed by Meta’s current platforms that the company is pouring huge resources into video games when its own internal research shows that 13.5% of teenage girls in the UK said Instagram made their thoughts of suicide or self-injury worse, and 17% said it made their eating issues worse. Fixing that should be the priority.
The issues we face in the two-dimensional world of Facebook won’t miraculously disappear in the three-dimensional world of the metaverse, which already faces complaints of groping and harassment. Clegg must demand that Facebook fix today’s problems before he creates tomorrow’s monstrosity.
I never wanted to be a whistleblower, but I realised that without the light of day on these insidious practices, we would never get the social media we deserve – social media that promotes our wellbeing and protects our democracies.
Mr Clegg, this is your legacy-making moment. You have enormous power to do good in the world. It is my sincere hope that you take this moment to create the much-needed change we all deserve from the company that promises to bring us closer together – but is stuck in a profit-driven cycle of tearing us apart.
Frances Haugen is a former Facebook product manager, whistleblower and advocate for accountability and transparency in social media