YouTube Kids’ promise of “a world of discovery, learning and entertainment” took on a whole new meaning for many parents after they noticed sexual content, colorful expletives, and other dodgy material landing on the “family friendly” service.
The app, which launched on iOS and Android in February, has faced criticism from consumer groups for failing to filter out inappropriate material, with some taking their complaints to the Federal Trade Commission (FTC).
Addressing concerns, Google-owned YouTube on Thursday previewed a forthcoming update that it says will introduce an in-app explainer giving the lowdown on how the software filters content. It’ll also offer clearer guidelines on how to set up parental controls and show how to flag up any inappropriate videos that the filtering system fails to spot.
Parents will also be prompted to choose whether to enable the Search function or turn it off, thereby limiting the littl’uns to a smaller, hand-picked selection of content presumably guaranteed to be free of ads for alcohol and videos of Sesame Street characters turning the air blue.
Earlier this year, the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood accused Google of deceiving parents “by marketing YouTube Kids as a safe place for children under five to explore, when, in reality, the app is rife with videos that would not meet anyone’s definition of ‘family friendly’.”
The Mountain View company is working to reassure parents by improving the app, but as it states in the software’s notes, “No algorithm is perfect, and even a perfect algorithm is no substitute for a parent or guardian’s judgment. … If your child finds a video that you feel is inappropriate, please flag the video and it will be reviewed as soon as possible.”
Google says the update, which also adds Chromecast, Apple TV, and smart TV support, is set to roll out in the “coming weeks.”