Extensive Report Reveals TikTok Has Major Child Pornography Problem
TikTok has a major problem with child pornography.
An extensive report from Forbes chronicles a terrifying reality: child pornography—legally known as child sexual abuse material—is easy to come by on TikTok, the short-form video sharing app with one billion monthly active users, making it the sixth most popular social media platform in the world.
To most users, Forbes writer Alexandra Levine reported, the posts tied to these criminal accounts "typically read like advertisements and come from seemingly innocuous accounts."
“But often,” she continued, “they’re portals to illegal child sexual abuse material quite literally hidden in plain sight—posted in private accounts using a setting that makes it visible only to the person logged in.”
The holders of these CSAM-filled accounts purportedly share illicit content using "post-in-private" settings, meaning anyone accessing the photos and videos must have the account's login information or use specified phrases. This tactic bypasses the automated moderation systems that might otherwise flag the content as a violation of the app's terms of use.
Seara Adair, a survivor of child sexual abuse and an advocate for children's safety, told Forbes she has reached out to TikTok employees, but to no avail. She has tried to alert them to this trend, explaining she believes users have discovered ways to evade automated content-moderation algorithms by posting black-screen videos that last only a few seconds and contain brief instructions for predators.
“There’s quite literally accounts that are full of child abuse and exploitation material on their platform,” she told the outlet. “Not only does it happen on their platform, but quite often, it leads to other platforms—where it becomes even more dangerous.”
Adair said she has seen videos depicting “a child completely naked and doing indecent things.”
For her part, Levine corroborated Adair's comments, reporting that some "post-in-private" accounts were relatively simple to access without any hurdles, while others required potential predators to contribute their own images before gaining access to the account information. Some account users were reportedly recruiting girls as young as 13 years old.
The issue is hardly unique to TikTok, according to Haley McNamara, director of the International Centre on Sexual Exploitation. She told Forbes all social media platforms are plagued with CSAM.
For the rest of this article, visit our content partners at cbnnews.com.
Reprinted with permission from cbn.com. Copyright © 2022 The Christian Broadcasting Network Inc. All rights reserved.