From Monday, Ofcom will begin enforcing the Online Safety Act’s illegal content codes, requiring technology firms to take swift action against harmful material on their platforms. While the Government has hailed the move as a “major step forward,” critics argue that the communications regulator’s approach is far too weak to protect children and vulnerable users effectively.
The new rules place a legal obligation on social media companies to detect and remove illegal content, including child sexual abuse material, terrorist content, and online fraud. However, campaigners and child protection advocates have expressed concerns that Ofcom’s implementation lacks urgency and ambition, failing to hold tech companies fully accountable.
Families and campaigners express disappointment
Among the most vocal critics is Ian Russell, chairman of the Molly Rose Foundation, who lost his 14-year-old daughter Molly in 2017 after she was exposed to harmful content online. He described Monday as a moment that “should have been a watershed,” but instead, he believes families have been “let down by Ofcom’s timidity and lack of ambition.”
Mr Russell accused the regulator of prioritising the interests of tech giants over child safety, stating:
“It is increasingly apparent that Ofcom’s timid approach has been dominated by their fear of legal challenge and their eagerness to placate the tech companies.”
He further called on the Government to take immediate action to strengthen the Online Safety Act, warning that any delay in closing life-threatening gaps in the legislation would be “unacceptable.”
His concerns were echoed by Andy Burrows, chief executive of the Molly Rose Foundation, who criticised Ofcom for not introducing a single measure specifically targeting suicide and self-harm content. He argued that UK children remain at “palpable yet wholly preventable risk” due to a lack of decisive intervention.
Government and Ofcom defend the new regulations
Despite the backlash, the Government insists that the enforcement of the Online Safety Act represents a landmark shift in online regulation.
Technology Secretary Peter Kyle stated:
“For too long, child abuse material, terrorist content, and intimate image abuse have been easy to find online. Today, that changes. Tech companies now have a legal duty to prevent and remove it.”
He emphasised that this was only the beginning, pledging that the Government would act “decisively” whenever new online threats emerged.
Ofcom, for its part, defended its approach, asserting that it had “moved quickly and set out robust safety measures” to guide platforms in complying with their new legal obligations.
An Ofcom spokesperson warned that companies failing to meet their responsibilities would face severe consequences:
“Platforms must now act quickly to comply with their legal duties, and our codes are designed to help them do that. But make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action.”
Under the Online Safety Act, social media firms that fail to meet their obligations could face fines of up to £18 million or 10% of their global turnover, whichever is greater. In extreme cases, Ofcom could seek to have non-compliant sites blocked in the UK.
Experts question the strength of Ofcom’s measures
Despite these assurances, leading child protection groups remain unconvinced.
Chris Sherwood, chief executive of the NSPCC, expressed concern that Ofcom’s final codes of practice were not “strong enough” and contained an unacceptable loophole allowing platforms to avoid removing illegal content if it is deemed not “technically feasible” to do so.
“This lets tech platforms off the hook,” he said. “Today marks an important step forward, but the Government and Ofcom must work together to significantly strengthen these codes to ensure meaningful change for children.”
His concerns were shared by Pepe Di’Iasio, general secretary of the Association of School and College Leaders (ASCL), who welcomed the progress but questioned whether it would truly be effective. He warned that social media continues to pose serious risks to young people, including exposure to bullying, harmful content, and poorly enforced age restrictions.
Fraud prevention measures criticised as insufficient
In addition to concerns over child safety, consumer rights advocates have raised alarms about gaps in fraud prevention measures under the new rules.
Rocio Concha, director of policy and advocacy at Which?, welcomed the requirement for platforms to tackle user-generated fraud but pointed out that the new rules do not cover fraudulent paid-for advertisements, a major source of online scams.
“Under the current timetable, firms in scope of the fraudulent advertising duties won’t be held accountable until 2027. That is simply not good enough and leaves consumers unnecessarily exposed to countless scam ads.”
Ms Concha urged the Government to implement the Online Safety Act in full, warning that millions more people could fall victim to ruthless online fraudsters unless action is taken urgently.
Free speech concerns from abroad
The UK’s new online safety regulations have also attracted scrutiny from international figures.
Last month, US Vice President JD Vance suggested that the Online Safety Act was infringing on free speech and negatively impacting American technology firms.
However, Prime Minister Sir Keir Starmer rejected these claims, defending the legislation as a necessary measure to protect the public from harm.
When asked about concerns over censorship, Sir Keir told Fox News:
“No, we don’t believe in censoring speech. But of course, we do need to deal with terrorism, paedophiles, and issues like that.”
Calls for urgent action grow
As Ofcom begins enforcing the Online Safety Act, campaigners, charities, and experts continue to demand tougher measures to close the loopholes they say leave children and vulnerable users at risk.
With growing pressure on the Government to act, it remains to be seen whether the current regulations will be strengthened, or whether concerns over Big Tech influence and enforcement weaknesses will continue to overshadow the fight against harmful online content.