Offsite and the Future of Discord
Posted: Sun Feb 22, 2026 12:45 am
Hello minions. I wasn't really sure how or where this should go, but I figured this is my soapbox, so this is where I'll open the conversation. I'm treading the line between Admin and Pharaoh here, but given this could lead to a significant change for all of us, it is vital that we think about the future of the offsite community and what is happening with Discord.
As you all may be aware by now (if not, I'm more than happy to be the informer), Discord made some significant changes to their user policy earlier this month that will affect everyone. They are taking steps to require people to verify their age by either submitting to an AI face scan or submitting a form of identification, otherwise limiting the types of content available to their accounts. They had been doing this in specific countries before February, but this particular move is in response to legislation in the United States attempting to sunset Section 230 of the Communications Act of 1934 (amended in 1996), which provides some legal immunity for Internet Service Providers and their users from being sued for what they say or do online. While the motive of "make money" always looms over the decisions of every social media platform, Discord's claim is that they are doing this in the name of teen online safety and to protect users from harmful content. However, in doing so they are also feeding a third-party database that they do not directly control, that they have not promised direct accountability for, and that they cannot guarantee will never be sold to other parties.
I'll make my opinion clear: I do not think anyone should be handing their ID or scanning their face for Discord, and I advise against doing so for any reason. Discord has already implemented this feature in several countries, and the results have not been great. After this was implemented in the UK, it took less than two months for Discord's third-party data service to leak 70,000 IDs, along with 1.5 terabytes of other user data. Discord has not proven it can keep this data safe, and there is no evidence that policies like this actually protect users from illegal content rather than merely pushing them to other, less protected spaces.
This leaves us, as a region, with some options for the future of our offsite community: we can either stay on Discord, see how this policy is implemented, and make our assessment afterward, or we can begin the work of transitioning to a different platform for our offsite focus. There are more than a few potential options, with Stoat and Matrix being two that have been brought up to me specifically. The first option is certainly the easy choice, and, at least if Discord isn't lying, it will barely affect most users. But it also opens the door for Discord to implement verification for everyone. A move to another platform is a lot of work: it means rebuilding our server from the ground up, finding volunteers to be new mods, and potentially losing the history we've built over our many years on Discord's platform.
That all said, I think moving is our best option to keep Discord or its third party vendors from threatening the privacy of our users. I am, however, unsure where to go, as I do not know enough about the options, and I also do not want to make such a drastic move without consulting our larger community. So I open the floor to all of you: I want your thoughts, your knowledge, and what you think we should do next. Thanks for reading.
Things to read regarding the current state of Discord and the direction it's going:
- Discord's announcement of "Teen by Default": https://discord.com/press-releases/disc ... s-globally
- Ars Technica: "Discord says hackers stole government IDs of 70,000 users": https://arstechnica.com/security/2025/1 ... 000-users/
- Electronic Frontier Foundation: Section 230: https://www.eff.org/issues/cda230
- Ars Technica: "Discord faces backlash": https://arstechnica.com/tech-policy/202 ... 70000-ids/
- Electronic Frontier Foundation: "Discord Voluntarily Pushes Mandatory Age Verification": https://www.eff.org/deeplinks/2026/02/d ... ata-breach
- The Rage: "Hackers Expose Age Verification Software Powering Surveillance Web": https://www.therage.co/persona-age-verification/