Facebook to Ban Groups Whose Posts Repeatedly Violate Content Guidelines in Bid to Curb Misinformation
In a bid to curb the spread of misinformation on its platform, Facebook has reportedly decided to track down and impose restrictions on groups that repeatedly allow misleading and incorrect posts to be published within them. According to a report by The Washington Post, Facebook's decision comes after numerous attempts to spread misleading information about the 2020 US Elections surfaced on the platform. As part of its efforts, Facebook will first identify groups that have a tendency to spread misinformation and fake news among their members, and place them on probation, during which the groups' volunteer administrators and moderators will be required to personally approve each post. If the moderators in question fail to check the spread of misinformation, or approve too many posts that are flagged as fake news, Facebook may shut the group down entirely.
The move joins other measures adopted by Facebook to prevent the spread of misinformation on its platform, such as pausing all political ads about the 2020 US Elections, restricting the overall reach of posts that make misleading allegations about the counting of votes, and imposing a fact-check label on claims regarding the outcome of the election. Incumbent President Donald Trump's posts on both Facebook and Twitter were repeatedly issued this label, amid his supporters alleging voter fraud, accusations that the opposing Democratic Party was "stealing" the election, and Trump himself claiming that he had "won this election, by a lot". No evidence of voter fraud or other electoral irregularities has been presented by any credible source, and Democratic candidate Joe Biden has now been declared President-elect.
It is also a major shift for Facebook, whose group moderation was previously based largely on individuals raising complaints about posts made within a group. The new policy will apply to both public and private groups, and any post that violates its guidelines will be automatically flagged. As a result, moderators will be compelled to act on fake news, propaganda, and other misleading posts. This marks a significant step by Facebook to curb the impact that deliberate political propaganda can have on an electoral process.
Facebook's measures against fake news and misinformation come in light of its role in allowing Trump, his aides, and his campaigners to use social media to major advantage during the 2016 US elections. Over the past four years, Facebook has been repeatedly called out for not taking enough action against serious issues, such as the spread of misinformation during election campaigns. More recently, Facebook faced considerable controversy in India after it was found to have allowed posts carrying hate speech and communally divisive rhetoric without imposing restrictions — an act that was seen as evidence of clear political bias. With Facebook now clamping down on groups that peddle propaganda based on misleading information, it is worth noting that such action would be vital not just during the 2020 US Elections, but during any major election campaign across the world.
Facebook's new strategy of putting groups on probation is a temporary measure, imposing restrictions on groups for a period of 60 days. During this period, groups will not have the option to appeal against the restrictions or have them reversed. However, if these groups are found violating Facebook's content and community guidelines during or after this period, Facebook may move to ban them permanently from its platform.