RISE, PROTECT, EMPOWER: A CALL TO SAFEGUARD EVERY CHILD IN THE DIGITAL AGE
THE PRESS CENTER | THE MORSE CODE | TECHNOLOGY
OCT 3 2025
A recent investigation by the UK-based watchdog Global Witness has brought forward urgent findings about how TikTok’s search system may expose young users, specifically those registered as 13 years old, to sexually explicit material. The report emphasizes that even when teens enable the platform’s “restricted mode,” the algorithm can still guide them toward inappropriate content. This discovery reinforces a powerful truth: your voice, your vigilance, and your commitment to digital safety truly matter.
Global Witness created seven test accounts in the UK, each designed to mimic a 13‑year‑old user on a freshly reset phone with no prior activity. Despite these controlled conditions, the accounts quickly received search suggestions that leaned toward sexual themes. In several instances, explicit suggestions appeared the very first time the search bar was tapped. Within only a few interactions, all seven accounts encountered pornographic material.
The organization stressed that the issue extends beyond accidental exposure. Its findings suggest that the platform’s search system may actively steer minors toward harmful content. The stakes could not be clearer: children deserve online spaces that uplift, educate, and protect them, not environments that place them at risk.
These revelations arrive at a time when lawmakers in both the UK and the U.S. are pushing for stronger online safety regulations. TikTok is already facing legal challenges related to concerns about the platform’s impact on young users’ mental well‑being.
In response to the report, TikTok stated that it is committed to user safety and has begun investigating the findings. The company noted that it removed content that violated its policies, updated its search suggestion systems, and highlighted that it offers dozens of safety tools for teens. TikTok also reported that most harmful videos are taken down before anyone sees them. These steps are meaningful, yet the report raises questions about whether the platform is fully meeting the requirements of the UK’s Online Safety Act 2023, which demands stronger age verification and more effective content moderation.
A media lawyer cited in the report expressed concern that the findings may indicate a breach of the law. TikTok, which has been under Ofcom regulation since 2020, stated that it is working to align with the updated rules and has added new safeguards, including AI‑based age detection and enhanced moderator training.
This situation places TikTok among several major platforms—such as YouTube and Instagram—that are under pressure to strengthen protections for young users. While many companies have introduced new tools like AI‑driven age estimation and default privacy settings for teens, experts continue to call for greater transparency, stronger enforcement, and more consistent accountability.
As conversations about online child safety grow louder, this report serves as a powerful reminder that we can all advocate for safer digital spaces. We can demand better. We can stay informed. We can protect the next generation.
And most importantly, we can move forward with confidence knowing that awareness is the first step toward meaningful change.
You are capable. You are informed. You are part of the solution.
SOURCE CREDIT: WWW.CNN.COM/BUSINESS/TECH