
TikTok Is Under Investigation Over Spread of Child Sexual Abuse Material

Solen Feyissa/Unsplash

The Department of Homeland Security is looking into the spread of child sexual abuse material on TikTok, sources told The Financial Times.

The platform faces two separate probes: one over CSAM videos allegedly allowed onto the public feed, and another over bad actors using a privacy feature to share CSAM with others.

DHS' probe comes after a child privacy researcher reported to TikTok that CSAM was being posted on the platform. The Financial Times found that TikTok moderators had a hard time keeping up with the number of videos being uploaded publicly. Investigators also found that abusers have used the platform for grooming, which involves befriending a child online and later abusing them. A source told the FT that the U.S. Justice Department is also investigating how predators are misusing a TikTok feature called "Only Me."

The Justice Department launched its investigation after child safety groups and law enforcement found that CSAM content was being passed through private accounts. Users would include keywords in their public videos, usernames and bios, while the illegal content was uploaded to the private "Only Me" feed, which makes videos visible only to the person logged into the account. Predators would then share their passwords with victims and other predators to access the "Only Me" videos.

TikTok did not immediately return Protocol's request for comment, but the platform said in a statement that it has worked with law enforcement on the issue and has removed accounts and content that included CSAM.

"TikTok has zero-tolerance for child sexual abuse material," the company said. "When we find any attempt to post, obtain or distribute CSAM, we remove content, ban accounts and devices, immediately report to [The National Center for Missing & Exploited Children], and engage with law enforcement as necessary."

Platforms and lawmakers have taken their own steps to address CSAM. Apple tried rolling out new child-protection features, but later delayed those plans after privacy advocates argued they would do more harm than good. A couple of months ago, lawmakers reintroduced the Earn It Act, which would strip tech companies of Section 230 immunity regarding CSAM at the state and federal level. At the same time, a recent report found that despite tech companies' plans to fight CSAM, the problem is growing larger and harder to handle.

TikTok told Protocol that it's not aware of the government investigations the FT reported, but that the platform has a zero-tolerance policy on CSAM. "Upon reading this story, we reached out to HSI to begin a dialogue and discuss opportunities to work together on our shared mission of ending child sexual exploitation online — just as we regularly engage with law enforcement agencies across the country on this crucial topic," a spokesperson said.

Protocol contacted DHS for more details about the investigation.

Authored by Sarah Roach via Protocol April 19th 2022
