Meta CEO Mark Zuckerberg claimed that the Biden administration applied significant pressure on Facebook to censor content related to COVID-19 vaccine side effects. “This hit the most extreme. I’d say it was during the Biden Administration when they were trying to roll out the vaccine program,” Zuckerberg said on The Joe Rogan Experience podcast, adding that while he supports vaccines, he was troubled by the demands.
He added, “I think that while they’re trying to push that program, they also tried to censor anyone who was arguing against it.”
“They pushed us super hard to take down things that were honestly true,” Zuckerberg explained, saying the administration urged Facebook to remove posts discussing vaccine side effects even when they were accurate. “They basically said, ‘Anything that says vaccines might have side effects, you need to take down.’ And I was just like, well, we’re not going to do that.”
The tech entrepreneur cited specific examples, including Meta’s refusal to remove humorous content. “They wanted us to take down this meme of Leonardo DiCaprio talking about class action lawsuits for vaccines 10 years from now,” Zuckerberg said. “We said, no, we’re not going to take down humour and satire. We’re not going to take down true things.”
Zuckerberg recounted how tensions escalated after public remarks by President Biden. “Biden gave some statement, saying these guys are killing people. And, I don’t know, then like all these different agencies and branches of government just like started investigating, coming after our company. It was, it was brutal. It was brutal,” he said.
Zuckerberg emphasised Meta’s transparency during investigations into government censorship. “We produced all these documents, and it’s all in the public domain,” he said. “People from the Biden administration would call up our team, scream at them, and curse.”
Meta announced on Tuesday, January 7, that it will end its independent fact-checking program in the United States, replacing it with a user-driven “Community Notes” system. The initiative, which Meta said is similar to a feature on X (formerly Twitter), will allow users on its platforms (Facebook, Instagram, and Threads) to flag posts that are misleading or lack context. The move marks a significant shift away from relying on external fact-checking organisations and experts.
In its announcement, Meta highlighted the challenges posed by the existing fact-checking model. The company noted that biases in expert opinions influenced what was fact-checked and how it was approached. “A program intended to inform too often became a tool to censor,” Meta stated. The company acknowledged that its efforts to moderate content had grown too extensive, leading to errors, user dissatisfaction, and limitations on free expression.
The “Community Notes” system aims to return more control to users, addressing frustrations caused by the current moderation methods. Meta admitted that its expansive content management system often “got in the way of the free expression we set out to enable.” The new system intends to balance user empowerment with content reliability while avoiding the pitfalls of overreach.
As part of this transition, Meta will stop demoting fact-checked content. Instead of imposing full-screen warnings that users must click through, the platform will display labels offering additional context or information about posts. This change aims to inform users without obstructing their access to content.