Teen Chat and Chat Avenue are under scrutiny for allegedly failing to protect children from online grooming through open messaging and chat features.
The UK’s online safety regime has entered a more confrontational phase, with Ofcom opening new investigations into Telegram and two chat platforms over suspected failures to protect children from serious harm. The move signals a shift from broad compliance warnings to more direct enforcement against services deemed to pose acute risks under the Online Safety Act.
Ofcom said it is investigating Telegram to determine whether the platform is doing enough to prevent child sexual abuse material from being shared. Separate probes have also been opened into Teen Chat and Chat Avenue, where the regulator says there are concerns that chat functions may be facilitating grooming and other harms to children. According to Ofcom, the providers have not demonstrated sufficient safeguards for UK users despite earlier engagement.
The cases are part of a wider enforcement drive rather than isolated actions. Ofcom has already been pressing file-sharing and file-storage services over child sexual abuse risks, and says some platforms have since introduced automated detection tools, blocked access for UK users, or otherwise changed their systems in response to regulatory pressure. In other cases, investigations have been closed after providers took corrective steps.
That broader context matters. Since the first online safety duties became enforceable, Ofcom has been moving from rule-setting into operational enforcement, testing whether platforms are actually putting in place the systems and processes needed to reduce illegal harms.
In the child safety area, that increasingly means proactive risk management, technical detection measures, and design choices that make it harder for offenders to share abusive material or contact children in the first place.
Ofcom has also made clear that services available in the UK cannot treat these duties as optional. Under the Online Safety Act, companies that fail to comply can face fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and the regulator can ask courts to impose business disruption measures or restrict access where necessary. That gives the current investigations weight beyond the individual platforms involved.
The bigger significance of the latest action is that platform accountability is being judged less on stated policies and more on demonstrable safeguards. The Telegram case in particular shows that even large, globally used platforms are now exposed to direct scrutiny if UK regulators believe child safety risks are not being properly addressed.
Taken together, the investigations suggest that Ofcom is trying to establish a more interventionist model of online safety enforcement, one in which companies are expected to anticipate and reduce harm rather than respond only after it has spread.
