Jan 07, 2025
(The Hill) - Social media giant Meta announced a series of changes to its content moderation policies Tuesday, including the elimination of its fact-checking program, in what CEO Mark Zuckerberg said was an effort to embrace free speech.

"We're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms," Zuckerberg said in a video posted Tuesday morning. "More specifically, here's what we're going to do. First, we're going to get rid of fact-checkers and replace them with community notes similar to X, starting in the U.S."

The changes mark a major move for the parent company of Instagram and Facebook and follow a series of other changes the company has made in recent weeks as President-elect Trump heads into his second term later this month.

Zuckerberg cited the recent election as a driving force in the company's decision, slamming "governments and legacy media" for pushing the company to "censor more and more." "The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech," he said.

The changes will be implemented on Facebook and Instagram, along with Threads; together, Meta's platforms host billions of users every day.

Meta's years-long fact-checking program enlisted third-party fact-checkers who moderated posts in more than 60 languages. The company said the program became too restrictive over the years.

The platforms will move to a community-based program called "Community Notes," akin to the system deployed by tech billionaire Elon Musk after he purchased the platform X, then known as Twitter, in 2022. Meta platforms will now rely on users to submit notes or corrections to posts that are potentially misleading or need more context.

Joel Kaplan, Meta's newly named global policy chief, said the company saw this approach "work on X," and emphasized that the system gives users the power to decide what context other users should see. "We think this could be a better way of achieving our original intention of providing people with information about what they're seeing – and one that's less prone to bias," he wrote in a statement Tuesday. "We want to undo the mission creep that has made our rules too restrictive and too prone to over-enforcement," Kaplan added.

Kaplan made clear Meta will not write Community Notes or decide which ones appear on the platform but will rely on ratings from contributing users. As on X, Meta's Community Notes will require agreement between users "with a range of perspectives to help prevent biased ratings," Kaplan said. The feature will be phased in over the next couple of months, starting in the U.S., and improved over the course of the year, according to Kaplan.

Meta executives hinted at frustrations with the company's content moderation policies last month, saying error rates can be "too high" and lead to harmless content being taken down by accident.

The content moderation policies were rolled out in recent years in response to mounting pressure on Meta and other major social media companies to prevent hateful or misleading information from spreading on their platforms. Zuckerberg in August told the House Judiciary Committee he regrets not being more outspoken about "government pressure" to remove content related to COVID-19.
Zuckerberg, in a letter to the committee, said Biden administration officials "repeatedly pressured" Meta to "censor" content in 2021 and vowed to push back should something similar happen again.

In addition to eliminating the fact-checking program, Zuckerberg announced Meta will also alter its content policies regarding some divisive issues such as immigration and gender. "What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas, and it's gone too far," he said. "So I want to make sure that people can share their beliefs and experiences on our platforms."

Meta will also move its trust and safety and content moderation teams from California to Texas, where there is "less concern about the bias of our teams," Zuckerberg said.

The changes are yet another indication Meta is attempting to court Trump, who has become a close ally of Musk in recent months. Meta dished out a $1 million donation to Trump's inaugural fund last month, and Zuckerberg met with the president-elect at his Mar-a-Lago resort in Florida.

Zuckerberg on Tuesday pledged to work with Trump to "push back on governments around the world that are going after American companies and pushing to censor more." "The only way we can push back on this global trend is with the support of the U.S. government, and that's why it's been so difficult over the past four years, when even the U.S. government has pushed for censorship by going after us and other American companies," he said.

Kaplan, a prominent Republican lobbyist, was named last week to replace Nick Clegg as the company's chief global affairs officer. The company also announced this week that Ultimate Fighting Championship (UFC) CEO and President Dana White, another Trump ally, will join its board of directors.