Tech firms face new Ofcom measures to stop harmful content going viral
Tech companies could face new demands to block illegal content and place tighter restrictions on livestreaming services as Ofcom looks to keep pace with “evolving” online harms.
The communications watchdog has launched a consultation seeking views on a raft of new measures that aim to keep UK users, especially children, safe from online harms.
Oliver Griffiths, Ofcom’s online safety group director, said: “Important online safety rules are already in force and change is happening. We’re holding platforms to account and launching swift enforcement action where we have concerns.
“But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online. So today we’re putting forward proposals for more protections that we want to see tech firms roll out.”
The new proposals focus on stopping illegal content from going viral, ensuring platforms are safer by design and strengthening the protection of children during livestreams.
They would require platforms to have protocols in place to respond to spikes in illegal content during a crisis and prevent their recommender systems from spreading material that might be illegal.
Sites would also have to introduce hash matching – a technique that compares a digital fingerprint of a post against fingerprints of known illicit material – to identify terrorism content and intimate images that are shared without consent, such as explicit deepfakes.
For instance, when tackling child sexual abuse material (CSAM), hash matching can identify exact duplicates of already reported content before they are re-uploaded to a platform.
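At its simplest, hash matching works by computing a fixed-length digest of an uploaded file and checking it against a database of digests of previously reported material. The sketch below is a minimal illustration of that idea only, assuming a plain SHA-256 digest and a hypothetical local hash list; deployed systems instead draw on shared industry hash databases and typically use perceptual hashes such as PhotoDNA rather than cryptographic ones.

```python
import hashlib

# Hypothetical list of fingerprints of previously reported material.
# Real deployments query shared industry hash databases rather than
# a local set like this.
KNOWN_HASHES: set[str] = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}


def fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest acting as the file's digital fingerprint."""
    return hashlib.sha256(data).hexdigest()


def matches_known_content(upload: bytes) -> bool:
    """Flag an upload whose fingerprint matches previously reported material."""
    return fingerprint(upload) in KNOWN_HASHES
```

Because a cryptographic hash changes completely if a file is altered by even a single byte, a scheme like this only catches byte-identical re-uploads; that is why production systems favour perceptual hashing, which still matches copies that have been cropped, resized or re-encoded.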
The new measures would also require platforms to assess the role automated tools can play in detecting harmful posts, including CSAM, content promoting suicide and self-harm, and fraudulent material.
Livestreams would have to be kept under continuous review by human moderators, and users would be banned from commenting on, reacting to, or sending gifts to children’s livestreams, as well as from recording them.
The consultation is set to close on 20 October, with a final decision due to be published by summer 2026. However, the measures have already come under fire from campaigners for not going far enough.
Reacting to the measures, the Molly Rose Foundation, an organisation set up in memory of 14-year-old Molly Russell, who took her own life after viewing self-harm content online, said in a post on X: “Ofcom's new measures will not address the current levels of harm or major new suicide and self-harm threats.
“It's time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on.”
The proposals come as online platforms have less than a month to comply with Ofcom’s child safety measures, set to come into force on 24 July.