Discord co-founder and chief technology officer Stanislav Vishnevskiy has announced the company is delaying the global rollout of its controversial age verification requirements, admitting it “missed the mark”.
Earlier this month, Discord revealed it would begin a “phased global rollout” of a new update in March that would give all new and existing users a “teen-appropriate experience” by default. Essentially, this meant many of the platform’s features – including access to age-gated channels, servers, app commands, and “sensitive content” – would be heavily restricted unless users had been age verified. This, the company said at the time, would enable it to give “teens strong protections while allowing verified adults flexibility”.
The outcry was swift, with some raising the spectre of last year's Discord security breach, which saw hackers successfully acquiring an unspecified number of users' personal data, including government IDs. Discord has now acknowledged users' concerns, pausing its rollout until a later date.
“The way this [announcement] landed,” Vishnevskiy wrote in a new blog post, “many of you walked away thinking we’re requiring face scans and ID uploads from everyone just to use Discord. That’s not what’s happening, but the fact that so many people believe it tells us we failed at our most basic job: clearly explaining what we’re doing and why. That’s on us. On top of that, many of you are worried that this is just another big tech company finding new ways to collect your personal data. That we’re creating a problem to justify invasive solutions… But that’s not what we’re doing.”
“We know many of you believe the right answer is not to do this at all. We hear you,” Vishnevskiy continued. “We also know these changes carry different weight for different communities, and that for some, questions of privacy and identity aren’t just preferences but safety concerns shaped by real experience. That’s not lost on us, and it directly informs the choices we’re making.”
In an effort to offer clarity on Discord's previous announcement, Vishnevskiy says most users – even those wanting to access age-restricted content – will never need to submit video selfies for facial age estimation, or supply identification documents to Discord's vendor partners. Rather, the company's own internal systems will attempt to automatically determine a user's age, based on how long their account has existed, any payment method on file, the types of servers they're in, and general patterns of account activity.
Less than 10 percent of users, he claims, will need to undergo alternative age verification methods to access age-restricted content. This may be because Discord’s systems can’t confirm they’re an adult or because local laws – such as those in the UK and Australia – explicitly require facial age estimation or ID checks. For those users needing to be verified by a third party, Vishnevskiy acknowledges there may be scepticism around “how we handle partnerships”, particularly in light of last year’s security breach. However, he insists Discord no longer works with the company involved in that incident and pledges to publish every verification vendor it works with – alongside their data handling practices – while also providing users with choices about the vendors they use.
This vendor transparency will be part of changes Discord is making to its global age verification rollout, which has now been delayed into the second half of this year. Vishnevskiy says the company won’t work with vendors whose facial age estimation is not performed entirely on-device, and that alternative verification options, including credit card verification, will be added before rollout. Elsewhere, it’s launching a new “spoiler” channel for servers that have previously utilised age-restricted channels as a way of separating users from non-adult topics they “may prefer to engage with on their own terms”. It’ll also publish a technical blog before rollout making it clear how its automatic age determination systems work, and will include information – on the number of users asked to age verify, the methods used, and how often Discord’s automated systems handled verification – in its transparency reports.
What it won’t do, however, is halt the rollout entirely. As Vishnevskiy explains, “The number of teenagers on Discord has significantly increased since the pandemic, and they deserve an experience appropriate to their age. At the same time, we believe adults should be able to have a full content experience on Discord. Doing both responsibly means having safeguards that help ensure age-restricted content stays in adult spaces.”
It's currently unclear how – or indeed if – the changes announced by Vishnevskiy will impact countries like the UK and Australia, which explicitly mandate facial age estimation or ID checks for adult content. Already, Discord has generated a second wave of controversy in the UK after it launched an "experiment" that would see age verification data leaving users' phones, despite previously stating otherwise. There's been additional concern over Discord's decision to use Persona as the verification vendor for its "experiment". Persona has funding ties with Peter Thiel, whose data and surveillance company Palantir is currently used by US federal agencies, including ICE.
