Culture

Facebook Bans 1 Million Groups in Fight Against Hate Speech & Conspiracy Theories

Lead Photo: Art by Alan Lopez for Remezcla

Over the past year, Facebook shut down more than one million groups containing hate speech, the company reported yesterday. That included some 13.5 million pieces of problematic content classified as either hate speech or “organized hate.”

New policies have been implemented for Facebook Groups to combat not only hate speech but also conspiracy theories and misinformation. Mashable reports that this is the first time Facebook has released stats like these, “concerning how the social media platform moderates what goes on” in these groups.

Facebook’s AI removes the bulk of this content: about 90% of hate speech and organized hate content gets the auto-boot. User reports presumably account for the rest, and any group that doesn’t remove infringing content can be deleted.

For groups without admins, Facebook will reach out to members and suggest someone take on the role; if nobody does, the platform will “archive” the group, which closes it off to new members and posts.

All this is essentially good news; pulling the plug on online communities claiming antifa started the wildfires in California, for example, helps promote truth in these confusing times. Covid-19 is another, especially critical area where Facebook’s new policies can filter out misinformation.

But other problems, many affecting marginalized groups, have yet to be remedied. Facebook enforces its policies inconsistently: folks’ accounts get suspended for posting “men are trash,” while no action is taken on, say, content that maliciously and intentionally misgenders a trans person, or is blatantly racist without using easily identifiable slurs. Let’s hope that’s on the docket for the platform, because hate and organized hate spread just as easily through these tactics, too.