UK regulator Ofcom has launched a formal investigation into messaging app Telegram over concerns it may be failing to prevent the sharing of child sexual abuse material (CSAM) on its platform, as part of a broader push to enforce the country’s Online Safety Act.
The investigation, opened on Tuesday (21 April), examines whether Telegram is complying with its legal duties to assess and mitigate risks of child sexual exploitation and abuse.
Evidence cited by Ofcom includes reports from the Canadian Centre for Child Protection suggesting the alleged presence and sharing of CSAM on the app.
Ofcom is simultaneously investigating two teen-focused chat sites, Teen Chat and Chat Avenue, over risks of online grooming, including predators coercing children into sending sexual images, sexual extortion (sextortion), and arranging in-person abuse.
The sites also face scrutiny for potentially exposing children to harmful content such as pornography.
Child safety a top priority
The moves come under the UK’s Online Safety Act, which imposes duties on platforms providing services to UK users to protect people, especially children, from illegal content.
Sharing or possessing CSAM is illegal in the UK, and platforms classified as “user-to-user” services must take “appropriate steps” to identify and reduce associated risks.
“Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,” said Suzanne Cater, Ofcom’s Director of Enforcement.
“It’s why we work so closely with partners in law enforcement and child protection organisations to identify where these harms are occurring and hold providers to account where they’re failing to meet their obligations.”
Cater added that while progress has been made with some file-sharing services, several of which have implemented perceptual hash matching to detect CSAM or withdrawn their services from UK users, the issue persists on larger platforms and teen chat services.
“These firms must do more to protect children, or face serious consequences under the Online Safety Act,” she said.
Potential penalties and process
If Ofcom finds a breach, it will issue a provisional decision and give the company an opportunity to respond before making a final ruling.
Non-compliance could result in significant fines of up to £18 million or 10% of the company’s qualifying worldwide revenue, whichever is greater, as well as requirements to take specific remedial actions or, in extreme cases, court orders for business disruption measures such as blocking access via ISPs or withdrawing payment services.
Ofcom has already demonstrated its willingness to act, having issued fines in prior cases, including against adult sites for inadequate age checks and 4chan for failures related to illegal content.
The regulator stressed that the Online Safety Act applies to any service accessible to UK users, regardless of where the company is based, but does not require platforms to restrict content for users outside the UK.