Facebook is banning hundreds of accounts affiliated with the conspiracy theory QAnon. | Stephen Maturen/Getty Images
For years, the conspiracy theory has spread unchecked on the social media network, amassing millions of followers.
After facing mounting pressure, Facebook is finally taking down nearly 800 Facebook Groups tied to QAnon, a wide-ranging set of pro-Trump conspiracy theories that have been linked to numerous acts of violence.
Facebook announced in a blog post Wednesday that it has taken down 790 QAnon-related Facebook Groups. The company is also removing 100 Pages and 1,500 ads tied to QAnon and imposing restrictions on more than 1,950 Groups and 440 Pages on Facebook and more than 10,000 accounts on Instagram.
In the same post, Facebook announced that it has also restricted more than 980 Facebook Groups and 520 Facebook Pages linked to “militia organizations and those encouraging riots,” including some whose members may identify as antifa.
These moves come as part of broader changes to Facebook’s rules to more tightly restrict dangerous groups on its platform. Under the new rules, Facebook will limit the reach of groups that have “demonstrated significant risks to public safety,” as QAnon has, but that do not meet the criteria for an all-out ban. Facebook also says it will minimize the influence of these groups by removing them from the platform’s recommendation algorithms, reducing their ranking in News Feed and search, and banning them from fundraising and advertising.
“We already remove content calling for or advocating violence and we ban organizations and individuals that proclaim a violent mission,” the company wrote in a blog post. “However, we have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior.”
Born in 2017 on the internet forum 4chan, QAnon is a collection of false conspiracy theories that broadly claim Trump is battling a group of Satan-worshipping pedophiles, including a “deep state” of officials within the US government who are plotting against him. In May 2019, the FBI identified QAnon as a potential domestic terrorism threat. Some QAnon followers have committed acts of violence inspired by the theory, including attempted arson and domestic terrorism, although it appears that only a small minority of believers currently engage in violence.
Facebook’s actions come after misinformation experts spent months pressing the company to limit the spread of QAnon on its platform. Recent reporting revealed that the movement has gained millions of followers on Facebook. The New York Times found that some of the largest QAnon groups saw “likes,” comments, and shares of their posts rise by as much as 300 percent in the past six months.
A month ago, Twitter took sweeping actions to limit QAnon on its platform by booting thousands of accounts, blocking QAnon from showing up in its trending topics and search, and banning users from posting links affiliated with QAnon theories.
While QAnon was once considered fringe, it has quickly entered the mainstream. Trump has repeatedly promoted QAnon followers’ social media accounts and declined to denounce the theory when asked about it. Marjorie Taylor Greene, a Republican from Georgia, could become the first QAnon supporter elected to the House of Representatives if she wins the general election this fall, worrying some political insiders who fear that QAnon could mimic the rise and influence of the Tea Party.
The rise of QAnon from the outskirts of the far right to a potentially acceptable faction of the Republican Party is unprecedented. While Facebook’s actions may make it harder for people to spread QAnon on social media, the company can’t undo the influence the movement has already gained in the established political community.