The far-right group QAnon has caught the public’s attention in recent weeks for its ability to manipulate social media platforms to spread misinformation and drive viral attention to its conspiracy theories.
Social media platforms like Twitter and TikTok have taken action against the group after long-debunked conspiracy theories like #PizzaGate gained new attention, banning accounts and blocking its popular hashtags from appearing in search. Although both platforms have proved fertile ground for the group, QAnon has flourished on YouTube for years, despite the company’s investments in moderating it.
Now, the video platform is trying to stop QAnon videos from appearing in users’ recommendations, but a Digital Trends test found that the conspiracy theory videos still appear in prominent places on the YouTube homepage.
QAnon supporters believe in a number of disproven conspiracy theories that originated on the 4chan and 8chan imageboards in late 2016. The group believes a figure named “Q,” who claims to be part of the Trump administration, is posting dispatches about President Donald Trump’s war against a “deep state” plot. The group has seen major growth on YouTube since 2016, as the platform’s engagement-driven recommendation algorithm promotes QAnon videos.
“QAnon initially grabbed traction on YouTube,” said Will Partin, a research analyst at Data & Society Research Institute, who has been studying the group since 2017. “It was picked up by YouTubers who really gave the conspiracy legs, and as a result, really spread it, and did the work of connecting it to other stuff that was already in the air” among conservatives.
“You can’t stop someone from actively seeking it out”
Thanks to the nature of the platform itself, YouTube proved to be the premier home for long-form pseudo-documentaries and talk shows deciphering clues, or “breadcrumbs,” dropped by “Q.” Those interested in QAnon theories could easily find other users recommending additional videos in the comments section or through YouTube’s recommendation algorithms, a feature critics have attacked as an easy way to “radicalize” viewers.
But now, type “QAnon” into YouTube’s search bar and the results will “prominently surface authoritative sources” like news organizations and experts, part of YouTube’s plan to elevate factual content to combat misinformation. YouTube has also started featuring text boxes and information panels linking to third-party sources; Digital Trends found that the platform provides a link to the Wikipedia article about QAnon under related videos.
YouTube said that since it implemented new content moderation policies in January, it has seen a 70% reduction in the number of views QAnon content gets from video recommendations. Because YouTube is a video platform, not a social network, it moderates on a video-by-video basis rather than by account. And according to a YouTube spokesperson, the platform’s systems have been adjusted to lower the ranking of QAnon content in the recommendations sidebar, even if you’re watching similar videos.
However, Digital Trends found that after it watched half a dozen QAnon-related videos on the platform, YouTube featured at least three videos pushing QAnon-related conspiracies under the “All Recommendations” tab on the homepage. One recommended video, which had over 1 million views, mentioned the conspiracy theory surrounding the chemical compound adrenochrome, which QAnon groups widely claim is harvested by elite Hollywood figures through the killing of children.
A representative from YouTube declined to comment for this story.
According to Partin, the way QAnon content is discovered across YouTube depends less on the videos recommended to users than on the content of the QAnon-related videos themselves. QAnon influencers will often shout out one another in a video or direct viewers to other pages, regardless of how YouTube moderates QAnon tags or videos. However, Partin said some QAnon tags are so obscure that it is impossible for YouTube to fill their search results with factual content from reputable sources.
“At that point, you can’t hide it algorithmically, you can’t stop someone from actively seeking it out,” he said.
Partin said YouTube was once “the key spot” for a user to be introduced to QAnon-related content, but now most of the recruiting happens on Facebook — where private QAnon groups can swell into the hundreds of thousands and are especially difficult to moderate.
Facebook is reportedly planning to make a similar move to moderate the group, according to The New York Times, following in the footsteps of Twitter and TikTok.
“When platforms move to ban QAnon, they often do it all at the same time to have strength in numbers,” said Partin. “But it is really hard to build a policy about QAnon because a lot of it is just regular conservative electorate stuff.”
Partin said the one thing he hopes people understand about the danger of QAnon-related content on social media is not how widely or how deeply its conspiracies are believed, but the risk of hitting the “share” button itself.
“It is impossible to know someone’s intentions, but the intention doesn’t matter that much,” he said. “I don’t really care if they do or don’t believe this, they act like they do, and they are spreading this content and this credence, regardless of the intention behind it.”