Discord announced on Monday that it’s rolling out age verification on its platform globally starting next month, when it will automatically set all users’ accounts to a “teen-appropriate” experience unless they demonstrate that they’re adults.
Users who aren’t verified as adults will not be able to access age-restricted servers and channels, won’t be able to speak in Discord’s livestream-like “stage” channels, and will have filters applied to any content Discord detects as graphic or sensitive. They will also get warning prompts for friend requests from potentially unfamiliar users, and DMs from unfamiliar users will be automatically filtered into a separate inbox.
Direct messages and servers that are not age-restricted will continue to function normally, but users won’t be able to send messages or view content in an age-restricted server until they complete the age check process, even if it’s a server they were part of before age verification rolled out. Savannah Badalich, Discord’s global head of product policy, said in an interview with The Verge that those servers will be “obfuscated” with a black screen until the user verifies they’re an adult. Users also won’t be able to join any new age-restricted servers without verifying their age.



Are you saying Discord does not have a problem with sexual predators targeting children?
No. That is an issue for all online social spaces to contend with. It won’t be solved by blocking mature content servers/channels.
“Won’t someone please think of the children” is the pablum you’ll accept so they can monetize your biometric data along with the data you create and consume on their platform.
You really think they give a shit about kids’ safety?
This will do absolutely nothing for that unless they outright ban DMs for unverified users.