Facebook will recommend problematic groups less and alert users if they are about to join one that has previously violated rules in a fresh attempt to curb misinformation.
The social network is redoubling its efforts against potentially harmful groups, which can be a hotbed for falsehoods and a particular concern during the pandemic.
A series of policy updates are being rolled out globally over the coming months to reduce the “privileges and reach” of those who fall foul of the standards before removing them completely, the firm said.
Among them is an alert that will be shown to people before they join any group that has repeatedly infringed the rules or had misinformation issues in the past.
When a group starts to commit breaches, the tech giant will rank it lower in recommendations, making it harder for people to discover, and its content will appear further down users’ News Feeds.
“We don’t believe in taking an all-or-nothing approach to reduce bad behaviour on our platform,” said Tom Alison, vice president of engineering at Facebook.
“Instead, we believe that groups and members that break our rules should have their privileges and reach reduced, and we make these consequences more severe if their behaviour continues – until we remove them completely.
“We also remove groups and people without these steps in between when necessary in cases of severe harm.”
Group admins and moderators will also temporarily have to approve all posts when their group has a “substantial” number of members who have violated policies or were part of other groups removed for breaking Facebook’s rules.
Admins who repeatedly approve posts that should have been rejected risk having the group taken down entirely, Facebook warned.
Accounts that continue to cause trouble will be blocked from posting or commenting in any group “for a period of time”, and will lose the ability to invite others to groups or create new ones.