
Federal investigation into child sexual abuse targets TikTok

The U.S. Department of Homeland Security has reportedly launched an investigation into TikTok over how the platform handles content depicting child sexual abuse and the moderation controls it has put in place. The agency is examining the alleged abuse of a TikTok feature called “Only Me” to share problematic content, something the Financial Times claims to have verified in partnership with child safety groups and law enforcement officials.

The Only Me feature lets users save their TikTok videos without posting them online. Once a video has been designated Only Me, it can be seen only by the account’s owner. In this case, the credentials of accounts used to share child sexual abuse material (CSAM) were reportedly passed among bad actors. Because the abusive videos never reached the public domain, they avoided detection by TikTok’s moderation system.


TikTok is no stranger to the problem

This is not the first serious probe into TikTok. The number of Department of Homeland Security investigations covering the spread of child exploitation content on TikTok reportedly increased sevenfold between 2019 and 2021. And despite bold promises of strict policy enforcement and punitive action against abusive content depicting children, bad actors appear to still be thriving on the platform.


“TikTok talks constantly about the success of their artificial intelligence, but a clearly naked child is slipping through it,” child safety activist Seara Adair was quoted as saying. Notably, the federal agency banned TikTok in March this year from all devices, including phones and computers, managed by the department’s information technology systems, citing data security concerns.


This also isn’t the first instance of TikTok hogging attention for the wrong reasons. Last month, a couple of former TikTok content moderators filed a lawsuit against the company, accusing it of not providing adequate support while they handled extreme content depicting “child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

A BBC investigation from 2019 revealed predators targeting children as young as nine years of age with sleazy comments and proposals. Elizabeth Denham, the U.K.’s information commissioner, launched a probe into TikTok the same year over the platform’s handling of personal data belonging to underage users. And given TikTok’s immense popularity among young users, simply deleting the app is not as straightforward an option as quitting Facebook.

The stakes are high, with media regulator Ofcom reporting that 16% of children aged three to four consume TikTok content. According to the U.K.’s National Society for the Prevention of Cruelty to Children (NSPCC), online grooming crimes reached a record high in 2021, with children at particularly high risk. Even though Instagram and Snapchat are the platforms preferred by predators, reports of horrific child grooming on TikTok have surfaced online on multiple occasions in the past few years.

TikTok has lately introduced measures to keep its young user base safe. Last year, TikTok announced that strangers would no longer be able to contact accounts belonging to children under 16, and that those accounts would default to private. The short-video-sharing platform also tightened restrictions on downloading videos posted by users under the age of 18. In addition, TikTok added resources to its platform last year to help sexual assault survivors, bringing in experts from the Rape, Abuse & Incest National Network (RAINN) and providing quick access to the National Sexual Assault Hotline.

Nadeem Sarwar
TikTok faces outright ban in first U.S. state

TikTok received more bad news on Wednesday after Montana Governor Greg Gianforte (R) signed into law a bill banning the popular app from January 1, 2024.

While more than half of U.S. states have already issued TikTok bans on government-issued devices, Montana’s action against the Chinese-owned app is significant as it’s the first state to impose a total ban on the app.

Former ByteDance exec claims China had access to TikTok data

TikTok is feeling the heat again after a former leading executive at its parent company, ByteDance, made a series of damning claims in a wrongful dismissal lawsuit recently filed in the San Francisco Superior Court.

Among the allegations made by Yintao Yu was that the Chinese Communist Party (CCP) “maintained supreme access” to TikTok data stored in the U.S. when he worked for the company between 2017 and 2018.

Is TikTok getting banned? Here’s every country that’s blocked the app

TikTok has been making headlines as of late, but not for reasons pertaining to the content on the app. Instead, several governments across the globe have been looking into the app's origins and, in some cases, even calling for bans. TikTok is currently in something of a state of limbo in many regions as governments work on crafting legislation and inspecting its roots. Recently, the U.S. held a congressional hearing with TikTok CEO Shou Zi Chew over privacy and security concerns, causing many citizens to wonder whether the app will be banned, as many representatives are calling for.

Here's what you need to know about every country in the world that's banned the app, introduced restrictions for it, or is currently considering one of the two.
