This isn't really accurate. Age verification is not mandatory for all accounts. You will be able to join a Discord with your friends, chat, and do voice without age verification.
Here's the exact list of what's restricted if you don't verify:
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
Taken from the announcement https://discord.com/press-releases/discord-launches-teen-by-...
So the claim that Discord is making ID verification "mandatory" or that you need it for gaming chats is untrue.
For children - this mandate also still makes the decision on behalf of the parents that a child must submit a scan of their face to a third party. Moving to Persona for age verification means verification data is sent off the user's phone - in direct contradiction to Discord's initial promise of keeping facial scan data solely on the phone. And we've been given no reason to trust that these third parties will delete the data, or that they won't use it for an improper purpose, such as creating derivative info from the ID or facial scan itself, unrelated to the sole purpose of verifying that an individual is an adult.
While we're at it - is there any legitimate reason why Discord is associating a person's actual or estimated age with their account as opposed to storing a value that states if they are or are not an adult? That sort of granularity seems unrelated to the stated purpose.
"Additionally, Discord will implement its age inference model, a new system that runs in the background to help determine whether an account belongs to an adult, without always requiring users to verify their age"[0]
0: https://discord.com/press-releases/discord-launches-teen-by-...
I was 14 once too, that’s how I got into what I do now.
It introduces the threat of being personally unmasked to anyone and everyone in the event the verification system (or a component of it containing your personal info) is hacked and the data dumped to the public.
It introduces the threat of your data being sold around with the "ground truth" of your identity and photo associated with it.
And even if these threats aren't realized... it happens often enough with related companies that the uncertainty will forever be there.
The threat of public humiliation.
The threat of losing your job.
The threat of losing your social connections.
The threat of personal assault.
All of these come to mind as concrete threats that have played out when someone has been doxed by a malicious person.
And now the risk and consequence of doxing is made so much worse when your government ID is associated with chats that are ostensibly private.
Also - the outcry here isn't from people who think they will no longer be able to use Discord in any way, shape, or form without going through an age verification process. That's a bizarre strawman that doesn't represent the main grievances being aired.
2. If they don't about-face, there's a lot about the implementation that remains to be seen.
Personally, I use discord for things that should be completely unaffected by this. I will not verify my age if there are surprises. I'll leave. If the communities I'm a part of decide to move, I'll support them and move even if I don't run into surprises.
There is absolutely no way we should support giving identifying information to a U.S. company given what's going on right now. The trust is no longer there. If you verify your identity, anything you say on Discord could be used against you if you ever pass through American borders.
Pretty much an AI detecting vulgarity and blocking it, although actual racist vulgarity gets through; things like 'here with my gock' or 'troll it' are what I've seen blocked.
So yes, it is a requirement; and yes, they are censoring people and things, and requiring others to have an ID to see the messages as well.
So 'Not mandatory for all accounts' is technically true, but I mean.. you get it, hopefully.
> You will be able to join a Discord with your friends, chat, and do voice without age verification.
No, building a community is a goal for many; this just isn't acceptable.
> So the claim that Discord is making ID verification "mandatory" or that you need it for gaming chats is untrue.
Again, not mandatory but creates more issues than it solves.
I know not everyone is so open but in the lgbt space most people are.
I've been noticing people in this space react to this news in a very worrisome manner, either by downplaying the need for nsfw in their lives (ironically, hours after discussing a clearly-nsfw matter!), or even worse: by equating all nsfw to "porn", giving them carte blanche to judge others who want the option for nsfw talk as "being in it just for sex".
It's been shocking for me to watch this phenomenon unfold in real time. This overwhelming "sanitizing" force that bulldozes through any nuance regarding the nature of being an adult in shared online adult spaces. It's especially rough for marginalized or minority communities, who oftentimes don't even have IRL spaces to talk about adult subject matters.
> >Content Filters:
Sound like something people might not want tied to real-world identities.
> >Age-gated Spaces:
So, #politics in my local instance.
1. So they can use it against you later if they want to (eg. blackmail, spying, etc.)
2. So they can start shutting off access to content that those in power don't like
Calling it now: Reddit is next.
You are correct. For now. But why would they stop there?
Supposedly this is to protect teens. If that's true, why would they continue letting teens chat with anonymous users? What if they get tricked into sharing sensitive images or video of themselves? Surely we need to know everyone's ID to ensure teens aren't unwittingly chatting with a known predator. It's for their safety. But for now that's a bridge too far. For now.
And why should we believe this even has anything to do with protecting teens? That's valuable data. Discord says they're not holding onto it... for now. But Discord is offering quite a lot to users for free. Why let such an obvious revenue source go unmonetized? They're doing this now because they're going public soon. Investors want an ROI and this action is sure to invite some competition. The people leaving want an alternative, so a competitor could get a foothold. Discord needs to stay ahead. And the users Discord keeps after this stunt are going to be the most resilient to leaving - the most exploitable. Surely they wouldn't care if the policy changes in the future.
The sky isn't falling. But the frog is boiling.
All so that we can post online about how Google is invading our privacy?
1. A way for politicians and the state to tie porn habits to US citizens and use that information against them in the future. Blackmail material against future politicians, business leaders, and the wealthy, to coerce them into doing what those in power want.
2. A way for conservatives to tighten the noose around non-chaste materials and begin to purge them from the internet. And if that works, that's hardly the last thing that will go. Next will be LGBT content, women's rights content, atheist content, pro-labor content, and more. (Or if you're on the other side of the political spectrum, consider that the powers could be used to remove Christian content, 2nd Amendment content, etc. It doesn't really matter what is being removed, just that the mechanisms are in place and that powers can put a lid on the populace.)
We aren't screaming loudly enough.
Do not try to sugarcoat this with a pedantic mistake.
This is far worse.
It's a first step down a path the Big Brother state wants.
Yell.
Scream.
Protest.
Consider for example what might have happened if there was no anonymity available back when the official party line was that covid couldn't possibly have come from a lab accident and that Hunter Biden's laptop was a plant. People could have been prevented from sharing links to articles that disagreed with the official truth!
This topic really brings out the crazy conspiracy theories.
No, politicians are not using Discord age verification to track constituents' porn habits and blackmail them with it later.
It's a few years too late to call it just a theory when it's already happening. There have been loads of "hacks" where nudes and dating profiles with photos were somehow leaked. There's a zero percent chance they'll somehow decide Discord or specific porn sites are safe, don't worry, just upload those full face and ID scans bro.
https://en.wikipedia.org/wiki/SEXINT
https://en.wikipedia.org/wiki/Sexpionage
https://en.wikipedia.org/wiki/John_Vassall
https://www.news.com.au/lifestyle/real-life/news-life/how-no...
https://www.eff.org/deeplinks/2013/11/nsa-tracking-online-po...
https://www.aclu.org/news/national-security/echoing-dirty-pa...
https://www.theguardian.com/world/2013/nov/27/nsa-files-live...
It's such a successful strategy, even Bitcoin scammers use it:
https://www.kaspersky.com/blog/extortion-spam/25070/
https://edition.cnn.com/2020/03/27/asia/south-korea-telegram...
For years, email spammers have claimed to have tracked victims' porn habits to try to extort them. That's a far cry from actually doing so. (And no, they aren't actually doing it.)
Being condescending doesn't help your case.
Link-bombing me with stories about Bitcoin scammers and South Korean telegram scams has nothing to do with your claim that politicians are using Discord to blackmail people about their porn habits.
If state-level spy agencies wanted to spy on someone's porn habits, they do not need to kindly ask Discord to collect that person's ID.
The first time I ever had a conversation about privacy concerns with anyone was around 1999. I've been hearing this kind of argument ever since then. Meanwhile, the erosion of privacy since back then has been nothing short of staggering.
We're at the point where the government is using Palantir to target people, yet somehow privacy concerns keep falling on deaf ears and keep producing the same old "the government doesn't need this latest privacy-eroding change" knee-jerk non-argument.
No, they might not need it, strictly speaking, but it sure as hell comes in handy, not to mention that it shifts the Overton window and serves as a stepping stone for the next invasion of privacy.
The only sexual habits people should be ashamed of are non-consensual sex and anything underage of course.
But the conservative values are the very reason many people can be blackmailed in the first place.
> For the majority of adult users, we will be able to confirm your age group using information we already have. We use age prediction to determine, with high confidence, when a user is an adult. This allows many adults to access age-appropriate features without completing an explicit age check.
> Facial scans never leave your device. Discord and our vendor partners never receive it. IDs are used to get your age only and then deleted. Discord only receives your age — that’s it. Your identity is never associated with your account.
> We leverage an advanced machine learning model developed at Discord to predict whether a user falls into a particular age group based on patterns of user behavior and several other signals associated with their account on Discord. We only use these signals to assign users to an age group when our confidence level is high; when it isn't, users go through our standard age assurance flow to confirm their age. We do not use your message content in the age estimation model.
I work with corporate privacy all of the time, and there is actually something really interesting going on here. We're basically never allowed to claim legal compliance using heuristics or predictive models. Like, never ever. They demand a paper trail on everything, and telling our legal team that we are going to leave it to an algorithm on a user device would make them foam at the mouth.
They are basically trusting a piece of software to look at your face or ID in the same way that, like, a server at a restaurant would check before serving you alcohol.
I am curious to see if this kind of software compliance in the long run is even allowable by regulators.
Part 3, Chapter 2, Section 12(4) specifies that user-to-user service providers are required to use either age verification or age estimation (or both!) to prevent children from accessing content that is harmful to children. Section 12(6) goes on to state that "the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child."
Part 12, Section 230(4) rules out self-declaration of age as being a form of age verification/estimation.
So I suppose it'll come down to whether or not Ofcom deems Discord's age estimation as "highly effective".
[Part 3, Chapter 2, Section 12(4)]: https://www.legislation.gov.uk/ukpga/2023/50/part/3/chapter/...
This is unrelated, but something I find interesting is that Category 1 user-to-user services (of which Discord is one, as per The Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025) are required by Part 4, Chapter 1, Section 64(1) to "offer all adult users of the service the option to verify their identity (if identity verification is not required for access to the service).".
They have devised a system so lackluster and unverifiable that they can claim they are following the letter of the law without having to turn over anything remotely useful for actually verifying or tracking people's identities.
It really bothered me that so many important projects were relying on a proprietary chat technology instead of using mailing lists or IRC, which were more decentralized and under the control of the local admin.
I would like to get back to a situation in which you can participate in group chats for open source projects without these being hosted on closed platforms, but if this results in major open source projects shifting from Discord to Telegram or WhatsApp, then nothing will have been learned.
for open source it's even worse because now your community knowledge base is hosted on a platform you don't control, that can change terms whenever it wants, and that requires account creation to even read. we went from mailing lists (archived, searchable, federated) to this.
the irony is discord is great for exactly what it was originally built for: voice chat while gaming. it's terrible as a replacement for forums and mailing lists, but the network effect pulled everything there anyway. i don't think the fix is another proprietary platform either, it has to be something with open protocols and local data ownership.
SimpleX seems trustworthy enough, with thoughtful design decisions, even if it fails my "forced tor" requirement. I haven't spent the time to dive into Session's architecture, but it's on my to-do list, currently the marketing copy makes it look like the best choice.
Hmmm. I feel like self-hosting is the FASTEST way to lose your anonymity. Your self hosted service is MUCH more easily tied to your identity than some third party like discord.
Just imagine you set up a self-hosted forum where you want to discuss something you want to keep private, but the government is very interested and wants to know who you are talking to.
Well, now they know any IP address connecting to your forum is a person of interest. They don't need to decrypt anything to know you are talking to each other.
By using something unique, you are going to make yourself uniquely identifiable.
I’ll be building a new platform on these two technologies and using Zoom or something else like Jitsi on the side for video/audio sharing.
It’s time to accept the loss of “features” and go back to something simpler but also something that can still be here in 38 years — like IRC has been.
I guess I have a hard time understanding these calls to switch to a platform that has even fewer features than the unverified Discord accounts. The blog post is incorrect in claiming that verification will be mandatory. It will only be necessary to access certain features and content. For simple IRC-style chats or even for voice chats with gaming friends, no verification is required.
The average Discord user, or even the 98th percentile user, isn’t going to be looking to switch to a platform that isn’t a replacement for the features they use. They’re just going to not verify their accounts and move on.
Communities aren’t about the “platform features”, they’re about the environment, something for-profit CEO after CEO fails to recognize, time after time.
Some people are, but I would bet money on it being a very small number of people who switch platforms. The HN bubble is not representative of the average user.
This is similar to when HN thought Reddit's userbase was going to shrink after the API changes (it didn't) or when the internet thought Netflix was going to lose subscribers when they cracked down on account sharing (they grew, not shrank).
A few blog posts about people switching to IRC or setting up their own Matrix servers in protest isn't representative of a mass movement.
Coming from a former heavy IRC user who's not going back except for nostalgia trips.
Things like image embeds, "markdown lite" formatting, and cross-device synchronization are now considered table stakes. There are always going to be some EFnet-type grognards who resist progress because reasons, but they should be ignored.
IRCv3 and Ergo support some of what's needed already (and in a backwards-compatible way!) but client support just isn't there yet, particularly on mobile.
One other feature that's absolutely considered table stakes now is persistent server-side history, with the ability to edit and delete messages. Modern chat services are less like IRC, and more like a web forum with live updates.
(Yes, you can poorly emulate server-side history on IRC with a bouncer. That's not enough, and it's a pain for users to set up.)
In fact, this is the reason some IRC networks blocked Matrix bridges at first (they now have settings to disable this).
I'm not saying mainstream people should use IRC though. Matrix is better for that.
Telegram lets group admins choose whether members can see history from before they join, which is the perfect solution (IMO).
Indeed.
Ergo offers server-side history but I'm not sure it supports edit/delete yet.
I thought age verification was only required to access "adult" content?
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
Does this mean that in panel-like settings, where hundreds of users are listening to a speaker, you need to be verified in order to ask questions or contribute by voice?
https://support.discord.com/hc/en-us/articles/1500005513722-...
Hell if I know why unverified users are allowed to speak in normal voice channels but not in stage channels.
https://soatok.blog/2025/07/24/against-the-censorship-of-adu...
Watching as things play out, I understand why people try to target Discord et al. with their complaints about the loss of anonymity. Being a tiny minority, they have no hope of influencing their governments, because the opposite position is widely popular.
Therefore, they try to convince commercial entities to disregard these laws as much as possible. This is particularly useful for that niche since fighting legislation cannot in itself be done anonymously. Therefore, they attempt to transform a very nonymous (haha) entity to do the fighting on their behalf. If the attempt fails, no harm befalls them.
I think it's a doomed endeavour. To get users on discord, it has to be portrayed to parents as a safe and legal service. The days of underground BBSes are gone. Now, if your brand gets associated with anything negative you're toast. And realistically the anonymous users are kind of useless as a whole. They won't pay, so they're practically just a drag on your platform. Losing them risks not very much.
Overall, a fight with a foregone conclusion. If you want anonymity you have to use other tools and be aware that simply using those tools marks you out as someone who desires anonymity.
I feel like it has always been on this path to capture more and more of your data and personally link it to who you are.
Their DPO ignored a PII leak I discovered and reported last year. Their DPO mail address just creates a Zendesk ticket; a few days later I could see the ticket had been locked and marked "solved" with no response.
So, I brought it to the Dutch DPA, who were very responsive, and on the same day as their "final update" email, my nearly-decade old Discord account was suddenly "suspended" hours later. The PII leak, which had been ongoing for over a year before my discovery at that point was suddenly stopped the same day. Funny how that works.
It took 5+ months for Discord's DPO and informal disputes team to finally get back to me after I informed them of the retaliation, with irrelevant copy & paste templates giving me walkthrough guides on how to file a "trust and safety" ticket.
When filing a ticket with "trust and safety" under appeal categories I get an automated "please appeal your ban through the app! I am now closing this ticket" response and my ticket's locked once again. And of course, appealing through the app gives me a generic system error.
There are those that will stay on Discord because the benefits of the first three outweigh the degradation of privacy. Then there are those that will leave because the first three aren't important enough to outweigh the privacy loss. There will be all sorts of people in between.
HN has a rather amplified showing of folks who won't trust anything unless it's completely decentralized using E2EE clients verifiably compiled from source that they've personally audited, running on hardware made from self-mined rare metals. The reality is that there is a spectrum of folks out there, all with different preferences, and while some folks will leave (in this case) Discord, others will remain because that's where the folks they want to chat/game/voice with are.
Back when I played games one friend in our group was banned from LoL arbitrarily so the whole group switched to Dota 2.
Honestly, all of these are documented probabilities at this point. SNS owners can make very decent predictions about what will happen if they introduce a certain kind of friction. Also, it’s not 2005 anymore; people are used to uploading their IDs everywhere. I mentioned it before as well: if you’ve used any large app, chances are you’ve uploaded your ID (Airbnb, Tinder, etc.)
This is a lie, this only affects you if you want to view porn/nsfw channels on discord. I'm in the UK happily using it without age verification.
Edit: it does look a little too corporate for me though with the 'book a demo' and the focus on my 'mission'. Doesn't really give hanging out with friends vibes. Just saying.
docker run --name ircd -p 6667:6667 inspircd/inspircd-docker
What's really more distressing is that it got this far before people figured out the game--maybe we should be reflecting on that part, the gullibility and the enabling of those people by those who knew better.
In the case of an online-based ID check, even with nice looking privacy terms, there is no guarantee that your ID won't be stored forever and/or re-analyzed many times cross-checking with other services, and worse leaked.
It will reduce attacks on and abuse of people, because those are usually founded on anonymity (no fear of repercussions, etc.)
I don't mind having a platform where everyone is at least somehow verified. yes, sure, you can bypass it and it is not 100% foolproof but what ever is? It raises the barrier for abuse and that's a good thing IMHO
And not just that event: Parents are roasting Roblox for kids getting groomed, but after the relationship is initiated, the groomers always immediately move the convo to Discord.
Imagine what will happen post-IPO.
Did they forget it's proprietary, and from the same person that made OpenFeint, which also had a privacy lawsuit?