New code makes global tech firms responsible for protecting underage users in UK

Robust age checks and better content moderation top the list of more than 40 measures in draft codes of practice.

Proposals set out in a draft online safety code will require “a step-change from tech firms in how UK children are protected online”, according to UK communications regulator Ofcom.

The draft Children’s Safety Codes of Practice set out more than 40 practical measures that technology companies will be required to take to protect underage users from harmful content, including pornography and suicide and self-harm material. Ofcom chief executive Melanie Dawes stressed that the measures “go way beyond current industry standards” and “firmly place the responsibility for keeping children safer on tech firms.”

The codes are part of the Online Safety Act, which became law in October 2023. The Act applies not only to UK service providers but to any company based outside the UK that provides a regulated service targeting the UK or that has a significant number of users in the UK. (Law firm Lewis Silkin provides a useful briefing on the scope and requirements of the Act.)

Heavy fines

Failure to comply could lead to fines of up to £18m ($22m) or 10% of global annual turnover, whichever is higher, as well as the blocking of non-compliant services. Senior managers judged to have failed to ensure compliance, or to have hindered attempts to enforce it, could face criminal sanctions.
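To make the penalty ceiling concrete: the cap is simply the greater of the two figures, so it scales with company size. A minimal sketch in Python (the £18m floor and 10% rate come from the Act; the turnover figure below is purely hypothetical):

```python
# Illustrative only: the statutory maximum fine is the greater of a
# fixed £18m floor or 10% of global annual turnover.
FLOOR_GBP = 18_000_000
TURNOVER_RATE = 0.10

def max_fine_gbp(global_annual_turnover_gbp: float) -> float:
    """Return the maximum fine for a firm with the given turnover."""
    return max(FLOOR_GBP, TURNOVER_RATE * global_annual_turnover_gbp)

# Hypothetical firm with £5bn global turnover: the cap is £500m, not £18m.
print(f"£{max_fine_gbp(5_000_000_000):,.0f}")
```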

Among the measures set out in the draft Codes are expectations that tech companies:

  • put robust age checks in place to ensure children cannot access harmful content. This could mean restricting access to parts of or entire sites;
  • ensure that algorithms recommending content filter out harmful material (a simplified sketch follows this list);
  • introduce better content moderation with clear and accessible methods of reporting and dealing with harmful material;
  • ensure strong governance and accountability for children’s safety within tech firms.
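To illustrate how the first two measures might fit together in practice, here is a minimal sketch in Python of an age gate combined with a recommender-side filter. Everything in it (the `is_age_verified` flag, the category labels, the data structures) is a hypothetical illustration, not an API or design mandated by the Codes, which specify outcomes rather than implementations:

```python
from dataclasses import dataclass

# Content categories the draft Codes single out as harmful to children:
# pornography and suicide/self-harm material.
RESTRICTED_CATEGORIES = {"pornography", "suicide", "self_harm"}

@dataclass
class User:
    id: str
    is_age_verified: bool  # set only after a robust age check has confirmed the user is an adult

@dataclass
class Candidate:
    item_id: str
    categories: set[str]
    score: float

def recommend(user: User, candidates: list[Candidate], limit: int = 10) -> list[Candidate]:
    """Rank candidates, excluding restricted material for unverified users."""
    if not user.is_age_verified:
        # Age gate: accounts not verified as adult never see restricted categories.
        candidates = [c for c in candidates if not (c.categories & RESTRICTED_CATEGORIES)]
    return sorted(candidates, key=lambda c: c.score, reverse=True)[:limit]

# Hypothetical usage: the restricted item is dropped despite its higher score.
child = User(id="u1", is_age_verified=False)
feed = recommend(child, [
    Candidate("a", {"sport"}, 0.90),
    Candidate("b", {"self_harm"}, 0.95),
])
print([c.item_id for c in feed])  # ['a']
```

The design point in this sketch is that filtering happens before ranking, so restricted items can never surface for unverified accounts, however highly the recommendation model scores them.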

Ofcom says that over the past 12 months it has spoken with more than 15,000 young people about their lives online, as well as with 7,000 parents. Consultation on the draft Codes will run until July 17 this year, with the final Codes to be published within a year. Once the Codes formally come into force, tech companies will have three months to assess the risks their services pose to children.

Technology Secretary Michelle Donelan said: “The government assigned Ofcom to deliver the Act and today the regulator has been clear: platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms which too readily mean they come across harmful material online. Once in place these measures will bring in a fundamental change in how children in the UK experience the online world.”

And Sir Peter Wanless, CEO of children’s charity the NSPCC, said: “Tech companies will be legally required to make sure their platforms are fundamentally safe by design for children when the final code comes into effect, and we urge them to get ahead of the curve now and take immediate action to prevent inappropriate and harmful content from being shared with children and young people.”