Britain’s communications regulator, Ofcom, launched an investigation yesterday into the Telegram messaging app after evidence suggested child sexual abuse material was being shared on the platform.
The probe is part of the UK’s efforts to crack down on online harms to children and the lack of clear accountability for them. While the country’s 2023 Online Safety Act has set tougher standards for social media platforms such as Facebook, YouTube and TikTok, Prime Minister Keir Starmer wants the platforms to go further.
The government has been consulting on a potential social media ban for children under 16, and Starmer met social media company executives last week to ask them to take more responsibility.
Ofcom said it had received evidence from the Canadian Centre for Child Protection regarding the alleged sharing of child sexual abuse material on Telegram, and had carried out its own assessment of the platform.
“In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content,” Ofcom said in a statement.
Telegram said it “categorically” denied Ofcom’s accusations, adding that since 2018 it had “virtually eliminated” the public spread of child sexual abuse material on its platform through detection algorithms.