Facebook has banned all accounts linked to the QAnon conspiracy theory movement from its platforms.
"Starting today, we will remove Facebook Pages, Groups and Instagram accounts," the company said on Tuesday.
The move is a significant escalation of Facebook's earlier decision to remove or restrict groups and accounts that share and promote QAnon material.
Update on October 6, 2020 at 2:00PM PT:
On August 19, we announced a set of measures designed to disrupt the ability of QAnon and Militarized Social Movements to operate and organize on our platform. In the first month, we removed over 1,500 QAnon Pages and Groups containing discussions of potential violence and over 6,500 Pages and Groups tied to more than 300 Militarized Social Movements. But we believe these efforts need to be strengthened when addressing QAnon.
Starting today, we will remove any Facebook Pages, Groups and Instagram accounts representing QAnon, even if they contain no violent content. This is an update from the initial policy in August that removed Pages, Groups and Instagram accounts associated with QAnon when they discussed potential violence while imposing a series of restrictions to limit the reach of other Pages, Groups and Instagram accounts associated with the movement. Pages, Groups and Instagram accounts that represent an identified Militarized Social Movement are already prohibited. And we will continue to disable the profiles of admins who manage Pages and Groups removed for violating this policy, as we began doing in August.
We are starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will need to continue in the coming days and weeks. Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports. These specialists study and respond to new evolutions in violating content from this movement, and their internal detection has provided better leads for identifying that content than sifting through user reports.
We’ve been vigilant in enforcing our policy and studying its impact on the platform, but we’ve seen several issues that led to today’s update. For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted the attention of local officials from fighting the fires and protecting the public. Additionally, QAnon messaging changes very quickly, and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update, which strengthens and expands our enforcement against the conspiracy theory movement.
This is not the first update to this policy – last week we began directing people to credible child safety resources when they search for certain child safety hashtags – and we continue to work with external experts to address QAnon supporters using the issue of child safety to recruit and organize. We expect renewed attempts to evade our detection, both in behavior and in content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary.