Meta announced Tuesday that it is abandoning its third-party fact-checking programs on Facebook, Instagram, and Threads. The company is replacing its paid fact-checkers with a Community Notes model that mimics X's volunteer program, under which users flag content they believe is incorrect or misleading.
Meta's Decision to Shift Focus
In a blog post, Meta's chief global affairs officer Joel Kaplan said the decision was made to allow more topics to be discussed openly on the company's platforms. The change will roll out first in the United States.
Impact on Content Moderation
Meta CEO Mark Zuckerberg said the new policies would allow more political content and posts on divisive issues to appear in users' feeds. The company aims to simplify its content policies and lift restrictions on topics such as immigration and gender.
The decision to roll back fact-checking and loosen moderation policies marks a significant shift from the measures Meta put in place after influence operations targeted its platforms around the 2016 US election. At the time, the company drew criticism for its hands-off approach to content moderation during high-profile elections.
Response to Fact-Checking Experts
Kaplan argued that fact-checking experts brought their own biases to the program, leading to over-moderation of legitimate content. He said the existing moderation policies had been adopted in response to societal and political pressure.
Concerns Raised by Critics
Critics, however, have raised concerns about the impact of Meta's decision on the media organizations that partnered with it for fact-checking, warning that the move could harm journalism and erode trust in information shared on the platforms.
Some critics view Meta’s decision as a retreat from responsible content moderation and an attempt to cater to specific political interests. The Real Facebook Oversight Board criticized the move as a step towards promoting far-right propaganda.
Community Notes Model
The Community Notes feature will rely on volunteers to write notes that add context to posts. A note becomes visible to all users only after other volunteers approve it, a requirement intended to guard against biased ratings.
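As a rough illustration of how such an approval gate might work, the sketch below implements a toy visibility rule: a note is shown only if it has a minimum number of volunteer ratings, raters from differing viewpoints participated, and most raters found it helpful. This is similar in spirit to the cross-perspective consensus X's system is known to use, but Meta has not published its scoring logic, so every name and threshold here is hypothetical.

```python
# Toy "diverse agreement" visibility rule for a Community Notes-style system.
# All names and thresholds are hypothetical; this is not Meta's algorithm.

from dataclasses import dataclass

@dataclass
class Rating:
    rater_leaning: float  # hypothetical viewpoint score in [-1.0, 1.0]
    helpful: bool         # did this rater mark the note as helpful?

def note_is_visible(ratings: list[Rating],
                    min_ratings: int = 5,
                    min_helpful_share: float = 0.8) -> bool:
    """Show a note only if raters from both sides of the viewpoint
    spectrum weighed in and a large share of them found it helpful."""
    if len(ratings) < min_ratings:
        return False  # not enough signal yet

    has_left = any(r.rater_leaning < 0 for r in ratings)
    has_right = any(r.rater_leaning > 0 for r in ratings)
    if not (has_left and has_right):
        return False  # require agreement across perspectives

    helpful_share = sum(r.helpful for r in ratings) / len(ratings)
    return helpful_share >= min_helpful_share

# Example: a note rated helpful by volunteers on both sides becomes visible.
ratings = [Rating(-0.8, True), Rating(-0.3, True), Rating(0.5, True),
           Rating(0.9, True), Rating(0.1, False)]
print(note_is_visible(ratings))  # True
```

The key design idea, at least in X's published version, is that raw majority votes are not enough: a note must earn support from raters who usually disagree, which is what the cross-perspective check above stands in for.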
While Meta believes that this model will empower the community to identify misleading content, similar initiatives on other platforms have shown mixed results in combating disinformation and hate speech.