
Logan Paul’s graphic YouTube video may have cleared initial human reviewers

So Sorry.
Users are often quick to point fingers at artificial intelligence when graphic content slips through social media filters, but a recent video of an apparent suicide victim may have slipped past more than just the software. YouTuber Logan Paul, a 22-year-old with 15 million subscribers, apologized twice after posting the video earlier this week.

The video, which showed a body hanging from a tree in Aokigahara, a forest nicknamed Japan's "suicide forest," was removed by Paul within 24 hours, but not before it reached YouTube's trending section. Paul's followers reportedly include many users under the age of 18, and his previous vlogs have covered topics ranging from Pokémon to stunts like "surfing" on a Christmas tree pulled behind a car.

The video immediately drew criticism, and a Twitter user who works as a YouTube trusted flagger claims that users flagged the video but that review staff approved it after a manual review. YouTube confirmed that the video violated the platform's policies on graphic content but did not comment on whether the video passed an initial manual review.

YouTube confirmed that Paul received a strike against his channel for the incident. Under the strike system, one strike is considered a warning, a second prevents the user from posting for two weeks, and a third terminates the account. Strikes expire after three months.

“Our hearts go out to the family of the person featured in the video,” a YouTube spokesperson said in a statement to Digital Trends. “YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases, it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.”

While YouTube prohibits graphic content, videos with educational or documentary value can be approved but age-restricted. For example, a historical clip of military conflict may be graphic but could be approved by YouTube's manual review team as educational.

Paul said that the video was raw and unfiltered but that he should have put the cameras down and never posted the footage. “I didn’t do it for views. I get views. I did it because I thought I could make a positive ripple on the internet, not cause a monsoon of negativity,” he wrote in an apology on Twitter. “That’s never the intention. I intended to raise awareness for suicide prevention and while I thought, ‘if this video saves just one life, it’ll be worth it,’ I was misguided by shock and awe, as portrayed in the video.”

The video comes a month after YouTube released an official statement on efforts the platform is taking to curb abuse, including adding more review staff and training the artificial intelligence algorithms to recognize more types of restricted content, including hate speech. At the time, the company said that the software led to review staff removing five times more videos that fell under the “violent extremist” category.

While software algorithms are often blamed for such slips, if the video did indeed pass a staff review, the incident is a reminder that human reviewers are liable to make mistakes, too. It comes just days after ProPublica reported that Facebook's review staff were inconsistent about which posts flagged for hate speech were removed and which were left up.

Hillary K. Grigonis