The removals mark the first time Facebook has publicly announced an action against QAnon, a far-right conspiracy movement that originated on 4chan and has been banned from other platforms like Reddit. Facebook didn’t specify exactly who was behind the effort, saying only that the “investigation found links to individuals associated with QAnon.”
Facebook’s head of security policy, Nathaniel Gleicher, said the accounts predate the coronavirus pandemic, but that “we’ve seen them opportunistically leverage COVID-related topics.”
“They frequently posted about news and topics including the upcoming presidential election and candidates, the current US administration, anti-Semitic and anti-Asian conspiracies, and COVID-19,” Facebook wrote in a report on the takedowns. “While it did not appear to be the focus of this campaign, some of the individuals behind this effort attempted to monetize their clickbait content by selling t-shirts and other merchandise.”
As NBC News points out, the pages represent only a “fraction” of QAnon supporters’ Facebook presence. While Facebook aims to debunk and hide conspiracy theories and misinformation, the company doesn’t remove posts that don’t otherwise violate its rules.
Together, the five pages, 20 accounts, and six groups reached more than 150,000 Facebook users, the company said: 133,000 users followed at least one of the pages, and 30,000 joined at least one of the groups. While that’s a tiny fraction of Facebook’s 195 million US users, it still highlights how a small number of “fringe” accounts can gain relatively wide distribution on the social network.