YouTube is trying to crack down on QAnon videos. It’s not working

The far-right group QAnon has caught the attention of the public in recent weeks for its ability to manipulate social media platforms, spread misinformation, and generate viral attention for its conspiracy theories.

Social media platforms like Twitter and TikTok have taken action against the group after long-disproved conspiracy theories like #PizzaGate gained new attention, banning accounts and blocking the group's popular hashtags from appearing in search. Although both platforms have proved to be fertile ground for the group, QAnon has flourished on YouTube for years — despite the company's investments in moderating it.

Now, the video platform is trying to stop QAnon videos from appearing in users’ recommendations, but a Digital Trends test found that the conspiracy theory videos still appear in prominent places on the YouTube homepage.

A QAnon supporter with a Q flag. Scott Olson / Getty Images

QAnon supporters believe in a number of disproven conspiracy theories that originated on the 4chan and 8chan imageboards in late 2016. The group believes that a figure named “Q,” who claims to be part of the Trump administration, is posting dispatches about President Donald Trump’s war against a “deep state” plot. The group has seen major growth on YouTube since 2016, as the platform’s engagement-driven algorithm promoted QAnon videos.

“QAnon initially grabbed traction on YouTube,” said Will Partin, a research analyst at Data & Society Research Institute, who has been studying the group since 2017. “It was picked up by YouTubers who really gave the conspiracy legs, and as a result, really spread it, and did the work of connecting it to other stuff that was already in the air” among conservatives.

“You can’t stop someone from actively seeking it out”

Thanks to its long-form video format, YouTube proved to be the premier home for pseudo-documentaries and talk shows deciphering clues, or “breadcrumbs,” dropped by “Q.” Those interested in QAnon theories could easily find other users recommending additional videos in the comments section or through YouTube’s recommendation algorithm — a feature that critics have attacked as an easy way to “radicalize” viewers.

But now, if you type “QAnon” into YouTube’s search bar, your screen will “prominently surface authoritative sources” like news organizations and experts — part of YouTube’s plan to elevate factual content to combat misinformation. YouTube has also started to feature text boxes and information panels linking to third-party sources. Digital Trends found that YouTube provides a link to the Wikipedia article about QAnon under related videos.


YouTube said that since it implemented new content moderation policies in January, it has seen a 70% reduction in the number of views QAnon content gets from video recommendations. Because YouTube is a video platform rather than a social network, moderation happens video by video, not account by account. And according to a YouTube spokesperson, the platform’s systems have been adjusted to lower the ranking of QAnon content in the recommendations sidebar, even if you’re watching similar videos.

However, Digital Trends found that after watching half a dozen QAnon-related videos on the platform, YouTube featured at least three videos with QAnon-related conspiracies under the “All Recommendations” tab on the homepage. One video that was recommended mentioned the conspiracy surrounding the chemical compound Adrenochrome — widely circulated within QAnon groups as being obtained by elite Hollywood figures through the killing of children — and had over 1 million views.

A representative from YouTube declined to comment for this story.

According to Partin, the spread of QAnon content across YouTube is driven less by the videos recommended to users than by the content of the QAnon-related videos themselves. QAnon influencers will often shout out one another in a video, or direct viewers to other pages, regardless of how YouTube moderates QAnon tags or videos. Partin also said some QAnon tags are so obscure that it is impossible for YouTube to fill their search results with factual content from reputable sources.

“At that point, you can’t hide it algorithmically, you can’t stop someone from actively seeking it out,” he said.

Partin said YouTube was once “the key spot” for a user to be introduced to QAnon-related content, but now most of the recruiting happens on Facebook — where private QAnon groups can swell into the hundreds of thousands and are especially difficult to moderate.

Facebook is reportedly planning to make a similar move to moderate the group, according to The New York Times, following in the footsteps of Twitter and TikTok.

“When platforms move to ban QAnon, they often do it all at the same time to have strength in numbers,” said Partin. “But it is really hard to build a policy about QAnon because a lot of it is just regular conservative electorate stuff.”

Partin said the one thing he hopes people understand about the dangers of QAnon-related content on social media is not how widely or how deeply its conspiracies are believed, but the risk in hitting the “share” button itself.

“It is impossible to know someone’s intentions, but the intention doesn’t matter that much,” he said. “I don’t really care if they do or don’t believe this, they act like they do, and they are spreading this content and this credence, regardless of the intention behind it.”

Meira Gebel
Former Digital Trends Contributor
Meira Gebel is a freelance reporter based in Portland. She writes about tech, social media, and internet culture for Digital…