The video, which showed a corpse hanging from a tree in Aokigahara, nicknamed Japan’s “suicide forest,” was removed by Paul within 24 hours, but not before it reached YouTube’s trending section. Paul’s followers reportedly include many users under the age of 18, and his previous vlogs have covered topics ranging from Pokémon to stunts like “surfing” on a Christmas tree pulled behind a car.
The video immediately drew criticism, and a Twitter user who works as a YouTube trusted flagger now claims that it was flagged by users but approved by review staff after a manual review. YouTube confirmed that the video violated the platform’s policies on graphic content but did not comment on whether it passed an initial manual review.
Logan Paul’s video was reported and YouTube manually reviewed it; they decided to leave it up without even an age restriction… people who have re-uploaded it since have received strikes for graphic content. Ridiculous. pic.twitter.com/Hj9lyiQwE2
— Ben (@TrustedFlagger) January 2, 2018
YouTube confirmed that Paul received a strike against his channel for the incident. Under the strike system, one strike is considered a warning, a second prevents the user from posting for two weeks, and a third terminates the account. Strikes expire after three months.
“Our hearts go out to the family of the person featured in the video,” a YouTube spokesperson said in a statement to Digital Trends. “YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases, it will be age-gated. We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.”
While YouTube prohibits graphic content, in some cases, such as for education or documentary purposes, content is approved but age-restricted. For example, a historical clip of military conflict may be graphic but could be approved by YouTube’s manual review team as educational.
Paul said that the video was raw and unfiltered, but that he and his team should have put the cameras down and never posted it. “I didn’t do it for views. I get views. I did it because I thought I could make a positive ripple on the internet, not cause a monsoon of negativity,” he wrote in an apology on Twitter. “That’s never the intention. I intended to raise awareness for suicide prevention and while I thought, ‘if this video saves just one life, it’ll be worth it,’ I was misguided by shock and awe, as portrayed in the video.”
The video comes a month after YouTube released an official statement on efforts the platform is taking to curb abuse, including adding more review staff and training its artificial intelligence algorithms to recognize more types of restricted content, including hate speech. At the time, the company said the software helped review staff remove five times as many videos in the “violent extremist” category.
While software algorithms are often blamed for such slips, if the video did indeed pass a review by a staff member, the incident suggests that human reviewers are liable to make mistakes too. It comes a few days after ProPublica reported that Facebook’s review staff were inconsistent about which posts flagged for hate speech were removed and which were left alone.