European Union regulators have initiated a formal investigation into Snapchat, alleging the social media platform has failed to adequately safeguard minors from online predators and harmful content. The probe centers on concerns that Snapchat’s age verification systems are insufficient and may be exposing young users to serious risks including sexual exploitation and criminal recruitment.
The European Commission, the EU’s executive arm, announced Thursday that Snapchat appears to be violating the bloc’s landmark Digital Services Act (DSA), which mandates stringent user protection measures for online platforms. Regulators specifically questioned the effectiveness of Snapchat’s ‘age assurance’ mechanisms, saying they suspect the system fails to prevent underage access despite the platform’s requirement that users be at least 13 years old.
Commission officials expressed particular concern that the platform fails to properly distinguish between users under and over 17, potentially exposing teenagers to inappropriate content. The investigation will also examine whether Snapchat’s systems adequately prevent adults from impersonating minors and whether the platform sufficiently protects young users from contact with malicious actors.
Additionally, regulators raised alarms about Snapchat’s apparent failure to restrict minors from viewing content promoting illegal or age-restricted products including drugs, vaping devices, and alcohol.
Henna Virkkunen, the Commission’s Executive Vice President for tech sovereignty, security and democracy, stated that Snapchat ‘appears to have overlooked’ the DSA’s rigorous safety standards designed to protect all users, particularly children.
In response, Snapchat issued a statement emphasizing its commitment to user safety, noting it has ‘fully cooperated’ with regulators through ‘proactive, transparent engagement.’ The company maintained that user well-being is a ‘top priority’ and that its platform incorporates ‘privacy and safety built in from the start, including additional protection for teens.’
The investigation represents the latest regulatory action against social media platforms concerning child protection. This development coincides with increased scrutiny on both sides of the Atlantic, following recent US court rulings holding tech companies accountable for harms to young users.
The DSA empowers EU regulators to impose substantial penalties for violations, including fines of up to 6% of a company’s annual global revenue. The investigation will now proceed, with Snapchat having the opportunity to respond to the allegations before any final determination is made.
