Ofcom has launched a formal investigation into the messaging app Telegram to determine if it "has failed, or is failing" to tackle child sexual abuse material.
The UK’s online safety regulator initiated its probe after receiving evidence from the Canadian Centre for Child Protection, which alleged the presence and sharing of such illegal content on Telegram.
Following its own assessment, the UK regulator decided to open an investigation into possible failings by Telegram "to comply with its duties in relation to illegal content."
Under the UK's Online Safety Act, providers of so-called user-to-user services, such as Telegram, are "required to assess and mitigate the risk of this horrific crime being perpetrated on their platforms".
Ofcom said firms which fail to do what is required of them to protect children will “face serious consequences”.
Suzanne Cater, director of enforcement at Ofcom, said: “Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities.
“It’s why we work so closely with partners in law enforcement and child protection organisations to identify where these harms are occurring and hold providers to account where they’re failing to meet their obligations.
“Progress has undeniably been made, particularly with file-sharing services, which are too often used to share horrific child sexual abuse imagery.
“But this problem extends to big platforms too, and teen-focused chat services are too easily being used by predators to groom children. These firms must do more to protect children, or face serious consequences under the Online Safety Act.”
If failures to comply with the Act are identified, Ofcom could impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
Ofcom also said in the most serious cases it can seek a court order requiring internet service providers to block access to the service in the UK.
The regulator also announced on Tuesday that it had opened investigations into whether the providers of chat services Teen Chat and Chat Avenue “are taking appropriate steps to assess and mitigate the risk of UK users encountering illegal content and activity, including grooming”.
The watchdog said its work with child protection agencies had raised concerns about the risk to children on the platforms, which both have chatrooms, private messaging, and what it described as media sharing functionalities.