Facebook has announced it will actively seek out and delete groups and accounts tied to QAnon, the baseless conspiracy theory that paints US President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and “deep state” government officials.
The company said on Tuesday (local time) that it will remove Facebook pages, groups and Instagram accounts for “representing QAnon” – even if they don’t promote violence.
Facebook previously relied on user reports to target associated pages.
The social media giant did not immediately explain what it means for Facebook groups to “represent” QAnon.
The move comes as the company, along with Twitter, takes action against a post by Mr Trump likening coronavirus to the flu.
Less than two months ago, Facebook said it would stop promoting the group and its adherents, although it faltered with spotty enforcement.
It said it would only remove QAnon groups if they promote violence. That is no longer the case.
The company said it is starting to enforce the policy as of Tuesday but cautioned that it “will take time and will continue in the coming days and weeks”.
The QAnon phenomenon has sprawled across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos in recent years.
QAnon has been linked to real-world harm, including kidnapping cases and dangerous claims that the coronavirus is a hoax.
But the conspiracy theory has also seeped into mainstream politics. Several Republicans running for Congress this year are QAnon-friendly.
By the time Facebook and other social media companies began enforcing policies against QAnon – however limited – critics said it was largely too late.
Reddit, which began banning QAnon groups in 2018, was well ahead, and to date it has largely avoided having a notable QAnon presence on its platform.
Twitter did not immediately respond to a message for comment on Tuesday.