Tech platforms may soon face tougher regulations to prevent illegal content from going viral and to enhance protections for children online, according to proposals outlined by Ofcom, the UK’s communications regulator. The measures, detailed in a consultation launched on Monday, aim to address evolving online risks and build on existing UK online safety laws.
Ofcom’s proposals focus on three key areas: halting the spread of illegal content, tackling harms at their source, and strengthening safeguards for children. Specific measures include requiring platforms to implement mechanisms for users to report livestreams depicting imminent physical harm, and mandating that larger platforms proactively detect terrorist material or content harmful to children.
Oliver Griffiths, Ofcom’s online safety group director, said: “We’re holding platforms to account and launching swift enforcement action where we have concerns. But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online.”
The proposals address issues such as intimate image abuse and the risks associated with livestreams, with varying requirements based on platform size and risk level. For instance, all user-to-user platforms allowing livestreaming would need reporting mechanisms, while only the largest platforms would be required to use proactive detection technology for harmful content.
Some platforms have already taken steps to address these concerns. In 2022, TikTok raised its minimum age for livestreaming from 16 to 18 following a BBC investigation revealing children begging for donations on livestreams from Syrian refugee camps. YouTube recently announced it would increase its livestreaming age limit to 16, effective 22 July.
However, critics argue the proposals highlight flaws in the UK’s Online Safety Act. Ian Russell, chair of the Molly Rose Foundation, established in memory of his daughter Molly, who died by suicide after viewing harmful online content, called the measures “sticking plasters” and urged the government to strengthen the Act to address systemic issues. Leanda Barrington-Leach of children’s rights charity 5Rights echoed this, advocating for holistic safety measures embedded in platform design. Conversely, Rani Govender of the NSPCC welcomed the focus on livestreaming safeguards, noting their potential to protect children in high-risk online spaces.
The consultation, open until 20 October 2025, seeks feedback from service providers, civil society, law enforcement, and the public. Ofcom’s efforts come amid ongoing investigations into nine companies, including Pornhub, over their compliance with the Online Safety Act, and follow recent moves by platforms such as Pornhub to introduce government-approved age checks in the UK.
The BBC has approached TikTok, Twitch, and Meta (which owns Instagram, Facebook, and Threads) for comment.