In the wake of US social media verdicts, a look at what limits other countries have imposed for kids

A series of landmark legal decisions this week has intensified global scrutiny on social media platforms’ impact on youth mental health. In consecutive rulings, Los Angeles and New Mexico juries found Meta and YouTube legally responsible for harms inflicted on children through their services, marking a significant validation of long-standing concerns about digital platform dangers.

Despite these judicial victories, child safety advocates emphasize that without comprehensive federal legislation, meaningful change remains elusive. The regulatory vacuum in the United States contrasts sharply with the aggressive measures other countries are implementing to protect children online.

Australia has emerged as a pioneering force, establishing the first nationwide prohibition barring children under 16 from social media platforms. The legislation imposes penalties of up to AU$50 million ($34 million) on non-compliant companies, though questions persist about how the ban will be enforced and what it could mean for users' privacy.

Brazil has enacted groundbreaking legislation requiring minors under 16 to link their social media accounts to parental supervision and banning addictive design features such as infinite scroll and autoplay videos. The country now mandates age verification systems that go beyond simple self-declaration.

Indonesia is poised to become the first nation in Southeast Asia to restrict social media access for users under 16, targeting "high-risk" platforms including TikTok, YouTube, and Facebook. Implementation is set to proceed in phases beginning March 28.

European nations are advancing similar protections. France has approved legislation restricting social media for those under 15 and prohibiting mobile phones in high schools, while Spain plans to bar access for those under 16. Denmark is pursuing comparable measures, and the UK is weighing a teenage social media ban as part of broader child protection efforts.

Malaysia has introduced licensing requirements for major platforms, compelling them to implement age verification and content safety measures as part of a broader strengthening of digital oversight.

These international developments reflect a growing consensus that platform self-regulation is insufficient to protect children in digital environments. As more nations adopt legislative solutions, pressure mounts on US lawmakers to advance the stalled Kids Online Safety Act, which passed the Senate but has since languished without a vote in the House.