Discord has announced that starting in early March, all of its more than 200 million monthly active users worldwide will be placed into a teen-restricted mode by default unless they verify their age through facial recognition or government identification. Under the new policy, which Discord describes as teen-by-default settings, unverified accounts face significant restrictions: sensitive content is blurred, access to age-restricted channels and servers is blocked, and direct messages from unknown users are routed to a separate request inbox. The global rollout marks one of the most sweeping age verification mandates ever imposed by a major social platform.
Users who wish to restore full access will need to complete a one-time verification process, either through facial age estimation, which processes video selfies on the user's device, or by submitting a government-issued identification document to one of Discord's vendor partners. Discord stated that video selfies used for facial estimation never leave the user's device and that identity documents are deleted shortly after verification is completed. Savannah Badalich, Discord's Head of Product Policy, said the rollout builds on the platform's existing safety architecture and is designed to create a safer experience for users over the age of 13.
The announcement has sparked significant backlash from privacy advocates and from users uncomfortable with providing biometric data or government identification to the platform. The concerns are amplified by a security incident in October 2025, in which hackers breached a third-party vendor Discord used for age-related appeals, potentially exposing government identification photos of approximately 70,000 users. Critics argue that requiring facial scans or identification creates a surveillance infrastructure, and that the prior breach demonstrates Discord cannot guarantee the security of such sensitive data.
Discord's move comes amid mounting regulatory pressure on social media companies to protect minors. Australia passed legislation in 2025 banning social media for users under 16, and several American states have enacted or proposed similar age verification requirements. The European Union's Digital Services Act also mandates enhanced protections for minors on large platforms. Discord has faced particular scrutiny because of the platform's popularity among teenagers for gaming and socializing and its history of incidents involving the exposure of minors to inappropriate content in poorly moderated servers.
The platform will also deploy an internal age inference system that runs in the background to determine account age status, reducing the need for repeated manual verification. Additional verification methods beyond facial estimation and identification submission are expected in the future. Industry analysts noted that while the intent to protect minors is sound, the execution raises difficult questions about the trade-off between child safety and user privacy that every major platform will increasingly have to confront as regulatory requirements tighten worldwide.