The communications watchdog Ofcom has published the final version of its new code of practice aimed at preventing harm to children online.

Tech companies that recommend content to children must overhaul their algorithms to ensure that such content is appropriate.

Age checks should also be more robust to prevent children from accessing inappropriate content, especially on platforms that host pornography and content related to self-harm, eating disorders and other potentially harmful topics.

The rules are still subject to parliamentary approval under the Online Safety Act, but if they become law, Ofcom will be given the power to impose fines for non-compliance.

In the most serious cases, the regulator could even apply for a court order to prevent the website or platform from being available in the UK.

Ofcom said the new rules contain more than 40 practical steps that tech companies and online platforms must take.

These include:

  • Algorithms that filter out harmful content to give children safer feeds
  • Effective age checks for inappropriate and age-restricted content
  • Fast action where harmful content is identified
  • Providing support to children who come across harmful content
  • Clear and easy options for children to block or mute accounts and disable comments
  • Naming a person in the organisation responsible for children’s safety.

Ofcom boss Dame Melanie Dawes called the new code a “gamechanger”, but some argued the rules did not go far enough.

The NSPCC described the new rules as “a pivotal moment for children’s safety online”, but urged Ofcom to go further, particularly in tackling encrypted messaging apps.