Facebook said that when a group starts to violate its rules, it will show that group lower in recommendations, making it less likely that people will discover it
Social media giant Facebook has announced key measures to prevent its interest-based forums, called Groups, both public and private, from spreading hate speech and misinformation. The measures come after the platform faced criticism over groups linked to the protests that led up to the Capitol riot in the US earlier this year.
"This is why we recently removed civic and political groups, as well as newly created groups, from recommendations in the US. While people can still invite friends to these groups or search for them, we have now started to expand these restrictions globally," Tom Alison, Vice President of Engineering at Facebook, said in a blog post.
The new changes will be rolled out globally over the coming months. Facebook said that when a group starts to violate its rules, it will show that group lower in recommendations, making it less likely that people will discover it.
This is similar to its approach in News Feed, where the platform shows lower-quality posts further down so that fewer people see them.
"We believe that groups and members that violate our rules should have reduced privileges and reach, with restrictions getting more severe as they accrue more violations, until we remove them completely. And when necessary in cases of severe harm, we will outright remove groups and people without these steps in between," Alison said.
Facebook said that it will start letting people know when they are about to join a group that has "Community Standards" violations, so they can make a more informed decision before joining. It will also limit invite notifications for these groups, so people are less likely to join. For existing members, the platform will reduce the distribution of that group's content so that it is shown lower in News Feed.
Facebook said it will also start requiring admins and moderators to temporarily approve all posts when a group has a substantial number of members who have violated its policies or were part of other groups that were removed for breaking its rules.
This means that content would not be shown to the wider group until an admin or moderator reviews and approves it. If an admin or moderator repeatedly approves content that breaks its rules, Facebook will take the entire group down.
"When someone has repeated violations in groups, we will block them from being able to post or comment for a period of time in any group," Alison said.