‘A game-changing moment for social media’ – what next for big tech after landmark addiction verdict?

A groundbreaking jury verdict in Los Angeles has delivered a seismic blow to tech giants Meta and Google, finding that their platforms, Instagram and YouTube, were deliberately engineered with addictive features and that the companies negligently failed to protect young users. The court ordered both companies to pay $6 million in damages to Kaley, a plaintiff who developed severe body dysmorphia, depression, and suicidal thoughts after using the platforms.

The ruling represents a potential watershed moment for social media regulation globally. Legal experts describe it as the end of an ‘era of impunity’ for technology companies, which have historically operated with limited liability for user harm. Despite immediate appeals from both defendants—with Meta arguing that no single app bears sole responsibility for teen mental health crises, and Google disputing YouTube’s classification as a social network—the verdict establishes a critical precedent.

Internal whistleblower testimony proved damning during proceedings. Arturo Bejar, a former Meta engineering director who worked on Instagram’s well-being efforts, revealed he had warned CEO Mark Zuckerberg years earlier about the platform’s dangers to children, stating the service had evolved from ‘a product you used to a product that uses you.’ Meta has denied these allegations.

The case exposes fundamental tensions between engagement-driven business models and user welfare. Social platforms rely on infinite scrolling, algorithmic recommendations, and autoplay features to maximize advertising exposure—practices now facing unprecedented legal scrutiny. While TikTok and Snap settled similar claims pre-trial, Meta and Google invested enormous resources in their defense, indicating the verdict’s profound commercial implications.

Globally, regulatory momentum is building. Australia has already implemented an under-16 social media ban, while the UK parliament debates similar restrictions through the Children’s Wellbeing and Schools Bill. This verdict strengthens arguments for age-gated access worldwide, with bereaved parents like Ellen Roome—whose son died following an online challenge—demanding immediate action.

Legal scholars compare this moment to Big Tobacco’s historic reckoning, anticipating mandatory health warnings, advertising restrictions, and potential revisions to the Section 230 protections that shield tech companies from liability for user-generated content. As dozens of similar lawsuits advance through US courts, this ruling fundamentally redefines accountability standards for digital platforms engineered for maximum engagement.