Meta has unveiled a major revamp of its content moderation strategy across Facebook, Instagram, and Threads, framing the change as a return to its original commitment to free expression. The company is ending its third-party fact-checking program in the U.S. and replacing it with a community-driven notes feature modeled on X's Community Notes.
According to CEO Mark Zuckerberg, Meta's current moderation systems take down too many posts by mistake: roughly 10 to 20 percent of removals may be errors. The new community notes system will require input from users with differing viewpoints before any added context appears on a post, which is intended to guard against bias.
The changes go beyond fact-checking. Meta is also lifting rules that restricted discussion of topics such as immigration and gender identity. Going forward, its automated scanning systems will focus on illegal content and severe violations, including terrorism, child exploitation, drugs, fraud, and scams; less serious problems will be addressed only when users report them.
In another significant move, Meta is relocating its trust and safety and content moderation teams from California to Texas and other locations around the country. It is also adjusting how it handles civic content, which it has restricted since 2021: the platforms will begin showing users more personalized political content based on what they click on and react to. These updates are set to roll out in the United States over the coming months.
These changes mark the most significant shift in Meta's policies since it launched the fact-checking program in 2016, and they could change how billions of users see and share content across Meta's social platforms.
Source: Meta