Meta, the parent company of Facebook and Instagram, announced on Tuesday that it will discontinue its third-party fact-checking program in favor of a new Community Notes initiative, in which users contribute to fact-checking efforts, similar to the model used by Elon Musk's social media platform X.
Meta said it ended the program with independent third parties over concerns about bias among expert fact-checkers and the sheer volume of content being fact-checked. The company will instead rely on crowdsourced contributions from users under the Community Notes model.
Meta's Chief Global Affairs Officer, Joel Kaplan, highlighted the success of a similar approach on X, where the community plays a role in identifying potentially misleading posts that require additional context.
The Associated Press had previously participated in Meta's fact-checking program but ceased its involvement a year ago.
Beyond the changes to fact-checking, Meta announced plans to relax restrictions on certain topics that are part of mainstream discussion and to focus its enforcement on illegal activities and high-severity violations such as terrorism, child sexual exploitation, and drug-related content.
Meta acknowledged that its content moderation systems had become overly complex, leading to enforcement errors. CEO Mark Zuckerberg tied some of the changes to recent political events, including Donald Trump's presidential election victory.
Zuckerberg emphasized a shift toward prioritizing free speech, particularly in light of recent cultural changes. The company's quasi-independent Oversight Board welcomed the changes and said it was willing to work with Meta to ensure the new approach balances effectiveness with the promotion of free speech.