The Molly Rose Foundation has warned that Meta’s recent changes to its content moderation policies could turn social media into a haven for harmful content and put children at greater risk.
The charity, set up in memory of Molly Russell, who took her own life at 14 after viewing harmful content on platforms including Instagram, has condemned what it describes as a “bonfire of safety measures” by Meta, arguing that the company’s decision to scale back its safety policies risks undoing progress made since Molly’s death.
Meta, which owns Facebook and Instagram, has announced significant changes to its content moderation practices in the name of promoting “free expression.” Among these changes is the cessation of automated scanning for some types of harmful content, with the company instead relying on user reports for content removal.
The decision has raised concern among campaigners, including the Molly Rose Foundation, which is calling on Ofcom, the regulator responsible for enforcing the UK’s new online safety rules, to strengthen its requirements on tech companies so that teenagers are better protected. The Foundation’s Chief Executive, Andy Burrows, has warned that the new policies could take social media back to the state it was in when Molly died.
“Meta’s bonfire of safety measures is hugely concerning. Mark Zuckerberg’s increasingly cavalier approach is taking us back to what social media looked like at the time that Molly died,” said Burrows. “Ofcom must send a clear signal that it is willing to act in the best interests of children and urgently strengthen its requirements on tech platforms. If Ofcom fails to act, the Prime Minister must intervene.”
Molly Russell’s death in 2017 was a turning point in the conversation around online safety, particularly when it comes to the risks posed by harmful content on social media. Molly’s family set up the Molly Rose Foundation to push for stronger protections for children online. The Foundation has been calling for tougher regulations to address content related to suicide, self-harm, and eating disorders on social media platforms.
The Foundation’s recent letter to Ofcom urges the regulator to strengthen its implementation of the Online Safety Act, with an emphasis on content moderation. Specifically, it asks that tech firms be required to proactively scan for all forms of harmful material, including content relating to intense depression, suicide, and self-harm. The Foundation also seeks clarity on whether Meta can change its internal policies without proper consultation, after reports suggested that Mark Zuckerberg made the changes unilaterally, leaving internal teams “blindsided.”
A spokesperson for Meta responded to the criticism, insisting that the company’s stance on harmful content has not changed. “There is no change to how we define and treat content that encourages suicide, self-injury, and eating disorders,” the spokesperson said. “We don’t allow it, and we’ll continue to use our automated systems to proactively identify and remove it.”
Meta’s spokesperson also pointed to the company’s broader commitment to safety, saying that it has around 40,000 people working on safety and security, and that Teen Accounts in the UK automatically limit who can contact teenagers and the types of content they see. Critics remain unconvinced, however, noting that previous research found Meta was responsible for just 2% of industry-wide takedowns of suicide and self-harm content.
The Molly Rose Foundation has repeatedly criticized Meta’s approach, arguing that it is insufficient to tackle the scale of the issue. Ian Russell, Molly’s father and chairman of the Foundation, recently wrote to the Prime Minister, Sir Keir Starmer, warning that the UK was “going backwards” on online safety and expressing concern that Ofcom’s approach to the Online Safety Act has “fundamentally failed to grasp the urgency and scale of its mission.”
In response to these concerns, an Ofcom spokesperson said that all platforms operating in the UK, including Meta, must comply with the UK’s online safety laws once the Online Safety Act is fully in force. “Tech firms must assess the risks they pose, including to children, and take significant steps to protect them,” the spokesperson said. “We’ll soon put forward additional measures for consultation on the use of automated content moderation systems to proactively detect this kind of egregious content.”
Ofcom has pledged to hold tech companies to account, with the full force of its enforcement powers if necessary. However, as Meta continues to revise its policies, the Molly Rose Foundation and other campaigners are calling for stronger, more immediate action to prevent harm to children and young people online.