Facebook Inc will no longer show health groups in its recommendations, the social media giant announced on Thursday, saying it was crucial that people get health information from “authoritative sources.”
Over the last year, the company took down more than 1 million groups that violated Facebook’s policies on misinformation and harmful content, it said in a blog post (https://about.fb.com/news/2020/09/keeping-facebook-groups-safe).
Misleading health content has racked up an estimated 3.8 billion views on Facebook over the past year, peaking during the coronavirus pandemic, advocacy group Avaaz said in a report (https://www.reuters.com/article/us-health-coronavirus-facebook/on-facebook-health-misinformation-superspreaders-rack-up-billions-of-views-report-idUSKCN25F1M4) last month.
Facebook, under pressure to curb such misinformation on its platform, has made amplifying credible health information a key element of its response. It also removes certain false claims about COVID-19 that it determines could cause imminent harm.
Facebook said in the blog post that it also now limits the spread of groups tied to violence by removing them from its recommendations and search results, and soon by reducing their content in its news feed. Last month, it removed nearly 800 QAnon conspiracy groups for posts celebrating violence, showing intent to use weapons, or attracting followers with patterns of violent behavior.
The world’s largest social network also said it would bar administrators and moderators of groups that have been taken down for policy violations from creating any new groups for a period of time.