Facebook will try to continue operating in Russia to offer a platform for “counter-speech” to the “propaganda” coming from Russian state-controlled media, Sir Nick Clegg has said.
The former deputy prime minister, who is now the head of global affairs at Facebook’s parent company, Meta, said it is important for the platform to stay online in Russia to allow users to “access information and speak out” against the country’s invasion of Ukraine.
Facebook has restricted access to the accounts of some Russian state-backed media outlets, and Sir Nick confirmed it would also be demoting content from those organisations, but said it would not be right to withdraw its services from the country.
Russia has limited access to Facebook and slowed the platform in response to the social network’s restrictions, with Sir Nick saying the firm had seen some “degradation” of video content in particular.
But he said the platform’s priority is to provide services for “ordinary people to express themselves” and that “the most powerful antidote to propaganda is not only restricting circulation but circulating the answers to it”.
“That is why we always want to strike the right balance to allow the flow of counter-speech to continue on our services, where we are in a position to influence that,” he said.
“We passionately believe as a company that we provide services which are free to use for billions of ordinary people around the world to express themselves when they want, how they want, where they want – within the constraints of the law and our community standards.
“And, in the long run, I think history and experience suggest the thing that really undermines propaganda is counter-speech.
“You can, of course, suppress, restrict, block and we do all of that – we’re being transparent about doing that – but I do think at a moment of heightened conflict it is always important to remember that it is free expression that, in the end, we should be seeking to help win out more than anything else.”
Sir Nick spoke to reporters as Meta published its latest quarterly Community Standards Enforcement Report, which showed Facebook took down more than one billion pieces of spam content in the last three months of 2021.
The company said the prevalence of harmful content on both Facebook and Instagram had “remained relatively consistent” during the period – decreasing in many areas but rising slightly in others.
According to the figures for October to December last year, Facebook took action against 1.2 billion pieces of spam content, four million pieces of drugs content and 1.5 million pieces of firearms-related content – all increases on the previous quarter, which the company attributed to improved proactive detection technology.
The report showed that the prevalence of bullying and harassment content had dropped to around 12 views per 10,000 content views, compared to around 15 per 10,000 in the previous quarter.
However, the figures suggest a slight increase in the prevalence of nudity and sexual activity content on Facebook to three views per 10,000, compared to between two and three in the quarter before.
The social network’s vice president of integrity, Guy Rosen, said the “vast majority of content that users encounter does not violate our policies”.
“We’ve made a lot of progress, but even with that there will always be examples of things we miss and, with the scale of our enforcement, there will be examples of things we take down by mistake. There is no ‘perfect’ in this kind of work,” he said.