Ofcom launches first major enforcement action under new Online Safety Act
The United Kingdom's communications regulator, Ofcom, has launched its first significant investigation under the country's new Online Safety Act, placing the messaging platform Telegram and several unnamed "teen chat sites" under scrutiny for potential child safety failures. The move signals a new chapter in tech regulation, where platforms face substantial financial penalties for failing to protect users from illegal content.
The probe into Telegram was initiated after Ofcom received intelligence from the Canadian Centre for Child Protection (C3P). According to Ofcom's press release, the information alleges the presence and sharing of Child Sexual Abuse Material (CSAM) on the platform. This action represents a critical test of the Online Safety Act's power and Ofcom's willingness to enforce it against major international technology companies.
The legal hammer: Understanding the Online Safety Act
Enacted in late 2023 with key powers coming into force in April 2024, the Online Safety Act (OSA) imposes a legal "duty of care" on online services. This requires platforms to actively assess and mitigate the risks of illegal and harmful content, with a particular focus on protecting children. Ofcom, as the appointed regulator, is now armed with formidable enforcement capabilities.
Under the OSA, Ofcom can demand information from companies, conduct on-site audits, and, most notably, levy fines of up to £18 million or 10% of a company's qualifying global annual turnover, whichever is greater. For a platform with Telegram's reach, such a penalty could be financially devastating. This investigation is the first time these powers have been formally directed at a major platform for such serious allegations, setting a precedent for all companies operating in the UK.
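The "whichever is greater" rule means the penalty ceiling scales with a company's size. A minimal sketch of the arithmetic, using the figures quoted above (the function name and the £500 million turnover are illustrative, not drawn from any real filing):

```python
def osa_maximum_fine(global_annual_turnover_gbp: float) -> float:
    """Illustrative OSA penalty ceiling: the greater of a fixed
    GBP 18 million or 10% of qualifying global annual turnover."""
    FIXED_CAP_GBP = 18_000_000
    TURNOVER_RATE = 0.10
    return max(FIXED_CAP_GBP, TURNOVER_RATE * global_annual_turnover_gbp)

# For a hypothetical platform turning over GBP 500 million a year,
# 10% (GBP 50 million) exceeds the GBP 18 million floor.
print(f"£{osa_maximum_fine(500_000_000):,.0f}")  # → £50,000,000
```

In other words, the fixed £18 million figure only binds for smaller companies; for any platform with qualifying turnover above £180 million, the 10% rule sets the ceiling.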
Technical breakdown: The platform-problem nexus
Ofcom's investigation will delve into the technical and policy measures Telegram has in place to combat CSAM. The platform's architecture presents specific challenges that are central to the debate over online safety and privacy.
Encryption’s double-edged sword
Telegram is known for its privacy features, including optional end-to-end encryption (E2EE) in "secret chats." While E2EE is a vital tool for protecting user communications from surveillance, it also complicates content moderation. By design, the platform provider cannot access the content of E2EE messages, making it nearly impossible to proactively scan for illegal material within those specific chats. However, much of Telegram's activity occurs in large public and private channels and groups that are not end-to-end encrypted by default, and this is likely where Ofcom's investigation will focus. The OSA does not outlaw encryption but requires companies to use all tools at their disposal to mitigate risks, a mandate that puts pressure on platforms to innovate in safety technology without compromising user privacy.
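Why E2EE defeats server-side scanning can be shown in a few lines. The toy cipher below is a deliberate simplification (a one-time XOR pad, not Telegram's MTProto or any production protocol); the point is only that the relay server holds ciphertext but never the key, so it has nothing it can hash or match:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real E2EE cipher; a one-time XOR pad is
    # NOT production cryptography, but the key-custody point holds.
    return bytes(b ^ k for b, k in zip(data, key))

# The key exists only on the two users' devices, never on the server.
shared_key = secrets.token_bytes(32)

plaintext = b"only the endpoints can read this"  # 32 bytes
ciphertext = xor_cipher(plaintext, shared_key)

# The server relays ciphertext. Without the key it cannot inspect,
# hash, or match the content against any known-material database.
server_view = ciphertext
assert server_view != plaintext
assert xor_cipher(server_view, shared_key) == plaintext  # endpoints only
```

This is why regulatory attention tends to concentrate on the unencrypted surfaces of a service, such as Telegram's public channels and groups, where the provider does hold the content and conventional moderation tooling can operate.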
Content dissemination and moderation
Telegram's channel and group features allow for the rapid distribution of content to thousands, or even millions, of users. This structure can be exploited by malicious actors to quickly disseminate illegal material. The investigation will scrutinize Telegram's content moderation systems, both automated and human. Key questions will include:

- Proactive Detection: Does Telegram use industry-standard hash-matching technologies (such as PhotoDNA) together with shared hash databases to proactively identify known CSAM images and videos before they are widely shared?
- Reporting Mechanisms: How effective and responsive are Telegram's user-facing reporting tools for illegal content?
- Enforcement Action: What is the platform's process for removing confirmed CSAM, banning offending users, and reporting incidents to law enforcement and organizations like the National Center for Missing & Exploited Children (NCMEC)?
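The proactive-detection workflow described above can be sketched in a few lines. This is a simplified stand-in: real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas exact SHA-256 matching (used here for illustration) only catches byte-identical files. The hash list and file contents are entirely hypothetical:

```python
import hashlib

# Hypothetical database of hashes of known illegal files, as maintained
# in practice by bodies such as NCMEC or the IWF. Platforms receive the
# hashes, never the underlying material.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-file-bytes").hexdigest(),
}

def is_known_material(file_bytes: bytes) -> bool:
    """Check an upload against the known-hash list before distribution."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(is_known_material(b"example-known-file-bytes"))  # → True
print(is_known_material(b"unrelated upload"))          # → False
```

The design matters for privacy as well as safety: because only hashes are compared, a platform can screen uploads on its unencrypted surfaces without any human reviewing lawful content, and flagged matches can be routed to trained moderators and law enforcement.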
Impact assessment: The ripples of regulation
The implications of this investigation extend far beyond Telegram. It serves as a clear warning to the entire tech industry that the era of self-regulation in the UK is over.
For Telegram, the immediate risks are financial and reputational. A finding of non-compliance would not only trigger a massive fine but also severely damage user trust. The company will be compelled to demonstrate its commitment to safety, potentially leading to changes in its features, policies, and transparency reporting.
For other platforms, this probe is a call to action. Companies providing services to UK users are undoubtedly reviewing their own compliance with the OSA, reinforcing their moderation teams, and ensuring their age assurance technologies are effective. The investigation into unnamed "teen chat sites" highlights Ofcom's focus on services specifically targeting younger users, where the duty of care is highest.
Globally, this move aligns the UK with other regulatory efforts like the European Union's Digital Services Act (DSA), which also imposes strict obligations on platforms to tackle illegal content. The cross-border nature of the intelligence—coming from a Canadian agency—underscores the international cooperation required to address a problem that knows no borders.
How to protect yourself and your family
While regulators hold platforms accountable, individuals and parents can take immediate steps to improve online safety. This is not about abandoning technology but using it more safely and consciously.
- Engage in open communication: The most effective tool is conversation. Talk to your children about what they do online, who they talk to, and the importance of not sharing personal information. Create an environment where they feel comfortable coming to you if they encounter something that makes them feel unsafe or uncomfortable.
- Utilize platform safety tools: Take the time to explore the privacy and safety settings on every app and service your family uses. Configure settings to restrict who can make contact, view profiles, and share content. Teach your children how to block and report users.
- Use device-level controls: Both iOS and Android operating systems have built-in parental controls that allow you to manage screen time, restrict app downloads, and filter content. Use them to create a safer digital environment on devices.
- Report illegal content: If you encounter suspected CSAM online, do not investigate or share it. Report it directly to the platform immediately. In the UK, you should also report it to the Internet Watch Foundation (IWF). This helps ensure the material is removed and law enforcement is notified.
- Enhance general online privacy: For an added layer of privacy in everyday browsing, a VPN can mask your IP address from websites and services, reducing your digital footprint. Note, however, that a VPN protects privacy only; it does not filter harmful content or prevent unwanted contact, so it complements rather than replaces the measures above.
Ofcom's investigation into Telegram is a watershed moment for online safety in the UK. Its outcome will not only determine the future of platform accountability in the country but will also influence the global conversation on how to build a digital world that is safer for everyone, especially its most vulnerable users.