EU Launches Formal DSA Investigation Into Snapchat Over Child Safety


The European Union has opened a formal investigation into Snapchat over potential breaches of Digital Services Act (DSA) regulations concerning child safety.

If violations are confirmed, the investigation could result in significant fines and mandated changes to Snapchat’s operations within the EU, potentially affecting its ability to attract and retain its young user base.

Regulators are examining whether Snapchat adequately protects minors from grooming and recruitment for criminal purposes. The inquiry also covers access to information on illegal drugs and age-restricted products by younger users.

Brussels questions Snapchat’s self-declaration age assurance system, which requires users to be at least 13. The European Commission stated this system might not prevent underage users from accessing the platform or confirm users are over 17 for an age-appropriate experience.

Investigators allege that adults can exploit the current system to misrepresent their age and impersonate minors. The app also reportedly lacks features that let users report suspected underage accounts or easily flag illegal content, and it fails to inform users about available avenues for redress.

Additional concerns include Snapchat’s “Find Friends” feature recommending child and teen accounts to other users and insufficient guidance on account safety features.

The Commission is gathering evidence, sending interview invitations, and requesting information from Snap. The investigation is based on analysis of Snapchat’s risk assessment reports from the past three years and an information request sent in October 2025.

“The safety and wellbeing of all Snapchatters is a top priority, and our teams have worked for years to raise the bar on safety,” a Snapchat spokesperson said. “As online risks evolve, we continuously review, strengthen, and invest in these safeguards.”

The company also stated it has acted proactively and transparently to meet DSA requirements and will fully cooperate with the Commission’s investigation.

Snap is among several social media companies facing increased scrutiny over minor safety. In 2023, the company introduced features to limit interaction between teenagers and strangers, such as requiring more mutual friends before a teen account appears in search results or suggested accounts.

Snap, along with TikTok, recently settled a lawsuit alleging social media addiction. In a separate but related case involving a 20-year-old woman, a jury recently ruled against Meta and YouTube; Snap was not ordered to pay damages.



