Shein under EU investigation over childlike sex dolls

The European Commission has initiated formal proceedings against global fast fashion retailer Shein under the Digital Services Act (DSA), marking a significant escalation in regulatory scrutiny of the platform’s operations. The investigation will examine multiple alleged violations, including potential failures to prevent the sale of illegal products and concerns regarding platform design that may promote addictive user behavior.

Central to the probe is the examination of systems designed to block prohibited items, with particular attention to content that could constitute child sexual abuse material. This follows previous reports to French authorities regarding the sale of childlike sex dolls on Shein’s platform, which the company states were immediately removed with accompanying seller bans and a complete prohibition on all sex doll sales regardless of appearance.

The investigation will also assess the transparency of Shein’s algorithmic recommendation systems and the potential psychological impacts of its interface design. EC spokesperson Thomas Regnier expressed concern about ‘gamification’ elements and reward programs that may create addictive usage patterns, noting that while such features aren’t inherently problematic, their opaque algorithmic implementation raises regulatory questions.

Under DSA provisions, Shein must disclose the primary parameters governing its product recommendations and offer users alternatives that do not rely on profiling. The formal investigation enables the Commission to pursue enforcement measures, including potential fines of up to 6% of global annual revenue. Based on Shein’s reported $38 billion in 2024 sales, that figure could reach approximately $2.28 billion.

Shein has emphasized its cooperative stance with regulators, stating: ‘Protecting minors and reducing harmful content remains central to our operational philosophy. We have invested significantly in enhanced compliance measures, including comprehensive risk assessment frameworks and strengthened protections for younger users.’