
Facebook removes nearly 800 QAnon-related groups, pages, hashtags, and ads

Facebook took down nearly 800 groups associated with the far-right QAnon conspiracy theory on Wednesday, along with more than 1,500 advertisements and 100 pages tied to the movement, in an effort to restrict “violent acts.”


In a blog post, Facebook said the action is part of a broader “Dangerous Individuals and Organizations” policy measure to remove and restrict content that has led to real-world violence. The policy will also impact militia groups and political protest organizations like Antifa.

“While we will allow people to post content that supports these movements and groups, so long as they do not otherwise violate our content policies, we will restrict their ability to organize on our platform,” the company said.

QAnon supporters believe in a widely disproven “deep state” conspiracy theory holding that President Donald Trump is secretly working to root out pedophilia and Satanism in Washington, D.C. The conspiracy theorists have recently latched onto the COVID-19 public health crisis, calling the virus a “bioweapon.”

QAnon theories hit the mainstream after the controversy surrounding #Pizzagate, in which a man brought a gun to a Washington, D.C. pizzeria, claiming he would find victims of child abuse there. The movement has also been linked to dozens of other violent incidents stemming from baseless theories shared in private Facebook groups and on message boards.

Facebook took action against QAnon earlier this month, when it removed an influential group with more than 200,000 members, but Wednesday’s crackdown is perhaps the social media giant’s most substantial move yet.

The company said it will limit QAnon content from appearing in its recommendations tab, reduce its visibility in search results, and prohibit QAnon-related accounts and groups from monetizing content, selling merchandise, fundraising, and purchasing advertising on both Facebook and Instagram. The company also plans to continue investigating how QAnon operates on its platform by observing “specific terminology and symbolism used by supporters to identify the language used by these groups and movements indicating violence and take action accordingly.”

In recent months, other social media sites like Twitter and TikTok have banned and disabled popular QAnon hashtags and accounts for inauthentic, coordinated behavior and for spreading disinformation.

However, do not expect QAnon to disappear quietly: Experts have called QAnon members “really good at adapting” to online ecosystems, and several QAnon supporters have won primaries for public office on platforms that represent the conspiracy theories shared within the group.

Meira Gebel
Former Digital Trends Contributor
Meira Gebel is a freelance reporter based in Portland. She writes about tech, social media, and internet culture for Digital…