Tech firms must enforce tougher age checks for social media platforms, regulator says
Tech firms have been told by the online safety regulator that major sites and apps must enforce minimum age rules following continued failures to keep under-13s safe online.
Ofcom wrote to the sites and apps most used by children, including Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube, requiring them to do more to prove to parents that they are protecting children online.
These platforms have until 30 April to say what they are doing, with Ofcom reporting on this in May.
The regulator is setting out four clear demands to hold tech firms publicly accountable for implementing effective age checks and delivering the safest possible environment for children online in the UK.
Ofcom research shows that minimum age policies of 13 are still not being properly enforced, with 72% of children aged eight to 12 accessing these platforms.
Failsafe grooming protections also need to be enforced, with strict measures to prevent strangers from contacting children on these sites.
Algorithms, which the regulator notes are the main way children experience harm online, also need to be made safer for children. Ofcom will issue legally binding information requests to large platforms to assess this.
An end to product testing on children has also been called for, particularly AI tools, which are widely used by children without parents knowing if they have been tested for safety.
Ofcom expects platforms to notify it once they have assessed the risks of updates before release, as required by law.
Dame Melanie Dawes, Ofcom chief executive, said: “These online services are household names, but they’re failing to put children’s safety at the heart of their products.
“There is a gap between what tech companies promise in private and what they’re doing publicly to keep children safe on their platforms.
“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid.
“That must now change quickly, or Ofcom will act.”
If the regulator is not satisfied with the platforms’ responses, it has said it will take enforcement action and, if necessary, consider strengthening the regulatory requirements under its codes to ensure further action is taken.
Ofcom has been investigating nearly 100 services since the UK’s online safety laws came into force last year.
This has resulted in changes to disrupt and prevent the sharing of child sexual abuse material and has seen high-risk services blocked.
Platforms including X, Telegram, Discord and Reddit have now introduced age controls, while pornography sites have brought in age checks.
The regulator has acknowledged that some progress has been made, but states that the industry has not gone far enough, with lasting impacts including a loss of trust from parents in tech firms’ ability to keep their children safe.
This has prompted the Government to consult on further legislative changes to address public concern.
