YouTube is considering major changes to its recommendation algorithm amid an investigation by the Federal Trade Commission into how it handles videos aimed at children.
The investigation is in its late stages and stems from complaints about the platform's treatment of kids dating back as far as 2015, the Washington Post reported, citing anonymous sources familiar with the matter. YouTube has been accused of failing to protect kids, particularly when its algorithm recommends or queues inappropriate videos. The FTC is also investigating whether YouTube improperly collected data from young viewers.
It’s not clear what the changes would entail, but the Wall Street Journal reported that leadership was considering moving all kids’ content into its own app or removing YouTube’s auto-play feature.
Earlier this year, reports surfaced that YouTube videos aimed at children contained harmful and graphic content. There are also concerns that kids could encounter YouTube videos containing hate speech, conspiracy theories or misinformation.
Even with a parent’s watchful eye, the platform’s algorithm for trending and queuing videos remains the main culprit in spreading harmful content to millions of viewers. Some videos explicitly target keywords popular with kids, such as Frozen or Marvel superheroes, in the hopes of generating views from younger audiences. As for videos encouraging racism or other hate, YouTube announced earlier this month that it would update its video removal policy to specifically target hate speech and discriminatory content.
YouTube encourages viewers to report content that violates its current Child Safety Policy. In theory, the system first warns the content producer, then issues strikes, terminating a channel after three strikes. Law enforcement is only notified if the content depicts a child in danger. Some videos fall through the cracks, however, and YouTube has struggled to keep up with policing the hundreds of hours of video uploaded every minute.
Even the kid-friendly app, YouTube Kids, contains a disclaimer: “We use a mix of filters, user feedback and human reviewers to keep the videos in YouTube Kids family friendly. But no system is perfect and inappropriate videos can slip through, so we’re constantly working to improve our safeguards and offer more features to help parents create the right experience for their families.”
YouTube did not respond to our request for comment about potential changes to the algorithm, but spokesperson Andrea Faville told the Post in a statement that not every product change under consideration makes it into YouTube itself.
“We consider lots of ideas for improving YouTube and some remain just that — ideas,” she said. “Others, we develop and launch, like our restrictions on minors live-streaming or updated hate speech policy.”