Meta is ending its fact-checking program in the US and replacing it with a new "Community Notes" system, the company announced on Tuesday (January 7). This move mirrors a similar initiative seen on X, the social media platform owned by Elon Musk.
The new system will empower users on Meta's platforms—Facebook, Instagram, and Threads—to flag posts that may be misleading or lack proper context, shifting the responsibility for content moderation away from independent fact-checking organizations and experts.
"Experts, like everyone else, have their own biases and perspectives. This showed up in the choices some made about what to fact check and how ... A program intended to inform too often became a tool to censor," Meta said in its statement.
The company acknowledged that its content moderation efforts had grown too large, resulting in mistakes, user frustration, and interference with the free expression Meta originally set out to promote. The current system, Meta said, had "expanded to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable."
The "Community Notes" system will be rolled out in the US over the next few months, with the company promising to improve and refine the model throughout the year.
Meta will also stop demoting fact-checked content. Instead, the platform will display a label informing users that additional information related to a post is available, replacing the current full-screen warnings that users must click through before seeing the post.
Catch all the Business News, Breaking News Events and Latest News Updates on Live Mint.