Australia has expanded its world-first social media ban with new rules targeting platforms that use psychologically manipulative design. The updated legislation, registered on Wednesday, now classifies as ‘age-restricted social media platforms’ any services that use algorithms ‘engineered to be addictive’ and deliver a ‘continuous dopamine stimulus.’
The new rules single out design elements such as infinite scroll with no end point, feedback mechanisms that display like counts and upvotes, and ephemeral ‘stories’ features that create artificial urgency among young users. These features are said to deliberately foster ‘fear of missing out’ (FOMO), compelling users to check their apps constantly.
Messaging and educational platforms, including Discord, Google Classroom, WhatsApp, and Roblox, remain exempt from the ban. Enforcement currently focuses on ten major platforms, including Snapchat, Facebook, and Instagram, while the eSafety Commissioner investigates several more.
Communications Minister Anika Wells highlighted the particular vulnerability of ‘Generation Alpha,’ who have been exposed to what she called an ‘addictive dopamine drip’ since getting their first smartphones and social media accounts. ‘Sophisticated algorithms, doomscrolling behaviors, persistent notifications, and harmful popularity metrics are appropriating their attention for substantial daily periods,’ Minister Wells said, affirming the government’s commitment to ‘illuminating these detrimental and habit-forming features targeting young Australians.’
The move follows growing international scrutiny of social media’s psychological impact, particularly after a New Mexico court ruled that Meta’s Facebook harmed children’s mental health and safety in violation of state law.
Meanwhile, Greens Senator Sarah Hanson-Young announced forthcoming legislation, the ‘Fix our Feeds’ Bill, comparing social media design directly to addictive substances. ‘This verdict substantiates long-standing concerns: social media platforms are intentionally architected to maintain user engagement, comparable to cigarettes or gambling machines,’ Senator Hanson-Young said of the American case.
The proposed bill would let users opt out of ‘predatory algorithmic systems,’ which the Greens say would enable safer digital experiences. Liberal Senator Sarah Henderson, however, was critical, arguing the existing social media ban has ‘failed to deliver promised outcomes’ and questioning the 4.7 million deactivated accounts reported by eSafety authorities.
Senator Henderson called the New Mexico decision ‘deeply condemnatory,’ saying that ‘platform architecture fundamentally seeks to maintain juvenile addiction to detrimental content.’ She also pushed for greater algorithmic transparency: ‘Australian citizens warrant unambiguous disclosure, and platforms must not operate concealed algorithmic systems when children’s welfare is implicated.’
eSafety Commissioner Julie Inman Grant is due to give a full compliance update on the social media ban next week.
