TikTok exposing children to porn is in breach of UK law, new report says

Researchers set up accounts pretending to be 13-year-olds and applied the relevant content filters, but were still shown pornographic content by TikTok’s algorithm. Legal experts believe this violates the UK’s Online Safety Act.

A new investigation by campaign group Global Witness has found that TikTok’s search algorithms appear to be exposing children to sexually explicit content and directing them toward it, in violation of UK law and the platform’s own community guidelines.

Researchers from the group created new accounts on the social media giant pretending to be 13-year-olds.
