In the space of just three months, YouTube removed a staggering 11.4 million videos for violating its Community Guidelines, the company revealed on Tuesday, August 25.
The content, taken down between April and June 2020, included material such as pornography, incitement to violence, harassment, and hate speech, as well as scams, spam, and misleading content.
The figure is the highest number of videos YouTube has removed from its platform in any three-month period since the service launched in 2005. The Google-owned company attributed the spike to greater use of automated detection technology after pandemic-related disruptions reduced its pool of human reviewers.
Of the 11.4 million removed videos, 10.8 million (roughly 95 percent) were flagged by automated systems, with the rest flagged by humans.
Notably, YouTube acknowledged that the heavier reliance on technology meant some of the removed content had been taken down in error.
“In response to COVID-19, we’ve taken steps to protect our extended workforce and reduce in-office staffing,” YouTube explained in its latest Transparency Report. “As a result, we are temporarily relying more on technology to help with some of the work normally done by human reviewers, which means we are removing more content that may not be violative of our policies.”
In a separate blog post, it elaborated: “When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement. One option was to dial back our technology and limit our enforcement to only what could be handled with our diminished review capacity.
“This would maintain a high level of accuracy, but would result in less content being removed from YouTube, including some content that violates our policies. The other option was to use our automated systems to cast a wider net so that the most content that could potentially harm the community would be quickly removed from YouTube, with the knowledge that many videos would not receive a human review, and some of the videos that do not violate our policies would be removed.
“Because responsibility is our top priority, we chose the latter — using technology to help with some of the work normally done by reviewers.”
In an effort to minimize disruption for its creators, YouTube said it decided, for now, to stop issuing strikes for content removed without human review, “except in cases where we have very high confidence that it violates our policies.”
Anticipating a surge in appeals from disgruntled YouTubers whose videos had been removed in error, the company said it devoted more resources to its appeals process, enabling it to restore content to the site more quickly where appropriate.
YouTube said that it’s continuing to work on improving the accuracy of its systems, while at the same time redeploying human reviewers “to the highest impact areas” as they gradually return to work.