(Reuters) – Facebook Inc said on Wednesday it had removed nearly 800 QAnon conspiracy groups for posts celebrating violence, showing intent to use weapons, or attracting followers with patterns of violent behavior, expanding its policy on groups that the company believes pose risks to public safety.
The world’s biggest social network also imposed some restrictions on the remaining 1,950 public and private QAnon groups it could find, no longer recommending them to users and making them less likely to be discovered in searches.
At least hundreds of thousands of Facebook users belong to one or more QAnon groups, though Facebook declined to give more precise figures.
Facebook also removed 980 groups that it said encouraged rioting, a majority of them seen as generally right-wing militias but a substantial number identified as part of the leftist antifa movement. (bit.ly/325oQEN)
The long-anticipated takedowns come amid sustained criticism as QAnon activity has surged on Facebook this year in tandem with political gains. A vocal QAnon supporter last week won the Republican congressional primary in a conservative Georgia district, drawing praise from President Donald Trump. Asked publicly about QAnon beliefs, the Republican president would not fault them. Twitter previously removed hundreds of QAnon accounts.
QAnon began in the aftermath of the false “pizzagate” conspiracies that claimed ahead of the 2016 election that prominent Democrats were running a pedophile ring out of the basement of a Washington restaurant.
It centers on anonymous postings from someone using the nickname Q who claims to be a high-ranking Trump administration official. Q and his most-followed supporters idolize Trump and have asserted that Democratic and Hollywood elites worship the devil, eat children, and in some cases have already been executed after secret military tribunals and replaced by actors. The FBI has identified the movement as a potential source of domestic violence, and murders and kidnappings have been committed by believers.
The phenomenon has continued to grow, largely over social media, by encouraging the curious to do their own “research,” engaging them in the process. The algorithms at Facebook and other platforms then reward that engagement by steering other users to such groups.
Reporting by Joseph Menn in San Francisco. Additional reporting by Neha Malara in Bengaluru; Editing by Paul Simao