
Ofcom has published the final version of a new set of online safety rules designed to protect children, which will come into force in July.
The regulator has set out more than 40 practical measures for tech firms to meet to comply with their duties under the Online Safety Act.
Here is a closer look at the measures and what they mean.
– What has Ofcom announced?
The online regulator has published its final codes of practice around protecting children for websites and apps.
The codes set out how platforms should design and operate their services in the UK so they are safe for children.
The Online Safety Act as a whole is being implemented through several codes of practice, each focusing on a different policy area, which Ofcom is announcing and rolling out in stages.
Thursday’s announcement focuses on the codes around protecting children from harmful content.
– What are the new measures?
Ofcom says there are more than 40 practical measures for firms to follow.
They include a requirement that any algorithms used to recommend content on a platform be configured to filter out harmful content from children’s feeds.
In addition, the “riskiest” platforms – such as those hosting pornography – must have “effective age checks” to identify which users are children.
The checks could be carried out using facial age estimation technology, by asking users to provide photo ID for verification, or via a credit card check.
Other measures include requirements that sites have processes to review, assess and quickly remove harmful content once they become aware of it. Platforms must also give younger users clear controls to tailor their online experience, such as the ability to filter out content they do not want to see, block or mute other accounts, and disable comments on their own posts.
The codes also require all platforms to have a named person accountable for children’s safety, and for a senior body to carry out an annual review of how they manage risks to children.
– What has been the response to the proposals?
They have been welcomed by many, but some campaigners and charities argue that the Online Safety Act and Ofcom’s codes do not go far enough to protect children.
Senior figures at the Molly Rose Foundation – set up by the friends and family of Molly Russell, who killed herself aged 14 after viewing suicide and self-harm content on social media – have said the codes leave tech firms too much wiggle room to self-regulate.
Andy Burrows, the charity’s chief executive, said on Thursday that under the rules, it is still up to sites to decide what constitutes harmful material, so such content could still reach children.
“Because of how Ofcom has developed its proposals, it’s giving far too much weight to industry, rather than focusing on how it builds measures or how it sets objectives that can actually tackle the problem,” he told Sky News.
Ian Russell, Molly’s father and chairman of the foundation, said he was “dismayed” by the “overly cautious codes”.
“Instead of moving fast to fix things, the painful reality is that Ofcom’s measures will fail to prevent more young deaths like my daughter Molly’s,” he said.
“Ofcom’s risk averse approach is a bitter pill for bereaved parents to swallow. Their overly cautious codes put the bottom line of reckless tech companies ahead of tackling preventable harm.
“We lose at least one young life to tech-related suicide every single week in the UK, which is why today’s sticking plaster approach cannot be allowed to stand.
“A speedy remedy is within reach if the Prime Minister personally intervenes to fix this broken system. Less than one in 10 parents think Ofcom is doing enough and Sir Keir Starmer must commit without delay to strengthen online safety legislation.”
– What happens next?
Platforms that are likely to be accessed by children have until July 24 to carry out an assessment of the risks their services may pose to children, which Ofcom can request to see.
From July 25, platforms must apply the safety measures set out in the codes of practice.
Those that fail to comply could face fines of up to £18 million or 10% of global revenue – whichever is greater – meaning potentially billions of pounds for the largest firms.
The regulator also has the power in very serious cases to apply for a court order blocking access to a site or app in the UK.