Discord is instituting a comprehensive global age verification system requiring facial scans or identity documentation for users seeking access to adult-oriented content. Starting in early March, the communication platform with over 200 million monthly active users will mandate these checks worldwide, expanding existing protocols currently limited to the UK and Australia.
The verification process offers two pathways: users may submit photographic identification or record a video selfie that is analyzed by artificial intelligence to estimate their age. Discord emphasizes that biometric data from facial scans will not be retained and that uploaded identification documents will be deleted promptly after verification is complete.
This initiative establishes a ‘teen-appropriate experience’ as the platform’s default setting, fundamentally altering content visibility and communication capabilities. Verified adults will gain access to age-restricted communities and sensitive material, while unverified users will face limitations in both content visibility and direct messaging functionality.
Savannah Badalich, Discord’s Head of Policy, stated: ‘Our safety initiatives prioritize teenage users above all. Implementing teen-by-default settings globally enhances our existing protective architecture while maintaining flexibility for verified adults.’
The announcement follows Discord’s appearance at a contentious 2024 US Senate hearing regarding child safety measures, placing the company alongside other social media giants facing increased regulatory scrutiny. Industry analyst Drew Benvie of Battenhall noted that while the safety intention is commendable, implementation across Discord’s millions of communities presents significant operational challenges.
Privacy advocates have expressed concerns about data security, particularly following an October incident in which approximately 70,000 user identification images were potentially compromised through a third-party verification provider. The platform's safety overhaul coincides with reports of a potential public share offering and mirrors similar protective measures adopted by Meta, TikTok, and Roblox.
