Meta ditches fact-checkers and makes other major changes to moderation policies
Jan 07, 2025
In a number of sweeping changes that will significantly alter how posts, videos, and other content are moderated online, Meta will adjust its content review policies on Facebook and Instagram, getting rid of fact-checkers and replacing them with user-generated community notes, similar to Elon Musk's X, CEO Mark Zuckerberg announced Tuesday.

The changes come just before President-elect Donald Trump is set to take office and just after Zuckerberg met with Trump and donated a million dollars to his inauguration.

"Fact-checkers have been too politically biased and have destroyed more trust than they've created," Zuckerberg said in a video announcing the new policy Tuesday. "What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas, and it's gone too far."

Zuckerberg, however, acknowledged a tradeoff in the new policy, noting that more harmful content will appear on the platform as a result of the content moderation changes.

Meta's newly appointed chief of global affairs, Joel Kaplan, told Fox on Tuesday that Meta's partnerships with third-party fact-checkers were "well intentioned at the outset, but there's just been too much political bias in what they choose to fact-check and how."

The announcement comes amid a broader apparent ideological shift to the right within Meta's top ranks, and as Zuckerberg seeks to improve his relationship with Trump before the president-elect takes office later this month. Just one day earlier, Meta announced that Trump ally and UFC CEO Dana White would join its board, along with two other new directors.

Kaplan, a prominent Republican who was elevated to the company's top policy job last week, acknowledged that Tuesday's announcement is directly related to the changing administration.

He said that there's "no question that there has been a change over the last four years. We saw a lot of societal and political pressure, all in the direction of more content moderation, more censorship, and we've got a real opportunity. Now we've got a new administration, and a new president coming in, who are big defenders of free expression, and that makes a difference."

A significant reversal

The moderation changes mark a stunning reversal in how Meta handles false and misleading claims on its platforms.

In 2016, the company launched an independent fact-checking program in the wake of claims that it had failed to stop foreign actors from leveraging its platforms to spread disinformation and sow discord among Americans.
In the years since, it has continued to grapple with the spread of controversial content on its platform, such as misinformation about elections, anti-vaccination stories, violence, and hate speech. The company built up safety teams, introduced automated programs to filter out or reduce the visibility of false claims, and instituted a sort of independent Supreme Court for tricky moderation decisions, known as the Oversight Board.

But now, Zuckerberg is following in the footsteps of fellow social media leader Musk, who, after acquiring X (then known as Twitter) in 2022, dismantled the company's fact-checking teams and made user-generated context labels called community notes the platform's only method of correcting false claims. Meta says it is ending its partnership with third-party fact-checkers and instituting similar community notes.

"I think Elon has played an incredibly important role in moving the debate and getting people refocused on free expression, and that's been really constructive and productive," Kaplan said.

The company also plans to adjust the automated systems that scan for policy violations, which it says have resulted in too much content being censored that shouldn't have been. Those systems will now check only for illegal and high-severity violations, such as terrorism, child sexual exploitation, drugs, fraud, and scams. Other concerns will have to be reported by users before the company evaluates them.

Zuckerberg said Tuesday that Facebook's complex systems for moderating content have mistakenly resulted in too much non-violating content being removed from the platform. Even a small error rate adds up at that scale: if the systems get something wrong just 1% of the time, that could affect roughly 20 million of the company's more than 2 billion users.

"We've reached a point where it's just too many mistakes and too much censorship," Zuckerberg said.

But Zuckerberg acknowledged that the new policy could create new problems for content moderation.

"The reality is this is a tradeoff," he said in the video. "It means that we're going to catch less bad stuff, but we'll also reduce the number of innocent people's posts and accounts that we accidentally take down."