TikTok Allegedly Directs Children's Accounts to Explicit Material in Just a Few Taps

According to a new investigation, TikTok has been found to steer children's accounts toward explicit material within a small number of clicks.

How the Study Was Conducted

Global Witness created fake accounts registered with the birthdate of a 13-year-old and turned on the app's "restricted mode", which is intended to limit exposure to adult-oriented content.

Researchers found that TikTok recommended sexually charged search terms to the fake accounts, which were set up on new devices with no previous activity.

Alarming Recommendation Features

The terms suggested under the "you may like" feature included "extremely revealing clothing" and "very rude babes", and later progressed to terms such as "explicit adult videos".

For three of the accounts, the sexualized search terms were suggested immediately.

Quick Path to Pornography

After just a few taps, the investigators found adult videos ranging from revealing content to penetrative sex.

The research group reported that the content attempted to evade moderation, often by embedding the explicit clip within a benign image or video.

For one account, the process took just two taps after opening the app: one on the search bar and another on the suggested search term.

Legal Framework

The campaign group, whose remit includes researching digital platforms' effect on public safety, reported carrying out multiple rounds of testing.

The first round took place before child safety protections under the UK's Online Safety Act came into force on 25 July; further tests were conducted after the measures took effect.

Alarming Results

Researchers noted that two of the videos appeared to show someone under the age of 16 and had been reported to an online safety organization that monitors harmful material involving minors.

The campaign group asserted that TikTok was in breach of the Online Safety Act, which requires digital platforms to prevent children from viewing harmful material such as pornographic content.

Official Reaction

A spokesperson for Ofcom, Britain's media watchdog tasked with enforcing the act, said: "We acknowledge the research behind this investigation and will examine its results."

Ofcom's codes for complying with the act state that online services carrying a medium or high risk of showing dangerous material must "adjust their systems to remove harmful content from children's feeds".

The app's policies prohibit pornographic content.

Platform Response

TikTok said that, after being notified by the research group, it had removed the problematic material and made adjustments to its recommendation system.

"Immediately after notification" of these allegations, we responded quickly to look into the matter, remove content that violated our policies, and implement enhancements to our search prompt functionality," said a spokesperson.

James Pearson