Facebook announced today that it is taking additional steps to limit the spread of harmful content and misinformation in Groups. The social networking giant is homing in on Groups that contain health-related posts and those that are tied to violence.
Tom Alison, Facebook's Vice President of Engineering, said the company has begun removing health groups from recommendations in order to give priority to authoritative sources of health information. That said, users can still invite others to health groups they belong to or search for these types of groups. The latest step comes after Facebook drew flak earlier this year for allowing people to share false information surrounding COVID-19, including conspiracy theories and fake remedies.
In addition, the company is taking more stringent steps to limit the reach of groups linked to violence by removing them from recommendations, reducing their presence in search, and limiting their content in News Feed. This builds on Facebook's ongoing efforts to crack down on groups that pose risks to public safety, such as U.S.-based militia organizations and QAnon.
Facebook noted that it took down more than 1 million groups over the last year for violating its policies. It is now preventing repeat offenders from creating new groups for a period of time, though it's not clear how long that window lasts. The company also plans to archive groups that have had no admin for some time in the next few weeks. Admins who are about to step down can invite members to become admins, and if nobody accepts the invitation, Facebook will suggest admin roles to members.