Discord has announced the launch of a global age assurance programme beginning in the UK and Australia, as the platform faces scrutiny over its alleged role in child sexual exploitation cases connected to Roblox.
Communication platform Discord has confirmed it will begin rolling out a global age assurance programme in the United Kingdom and Australia. Wider international expansion is planned from March 2026. The move follows mounting scrutiny over the platform’s safety systems, after it was named in a number of U.S. lawsuits alleging child sexual exploitation linked to Roblox.
In a support article published on 9 February titled “How to Complete Age Assurance on Discord,” the company outlined how users will verify their age and what information may be required.
Discord, a free platform that enables users to communicate through text, voice and video chats in community “servers,” has grown far beyond its gaming roots. Today, students, creators and online groups use it to host topic-based channels, private messages and live discussions across mobile and desktop devices.
However, over the past year, multiple U.S. civil claims involving alleged child sexual exploitation have referenced Discord. Plaintiffs claim that adult perpetrators initially contacted minors through Roblox before moving conversations to private Discord servers and direct messages.
Legal complaints allege perpetrators used Discord as an off-platform tool to continue grooming, share explicit material and, in some cases, arrange in-person meetings. The lawsuits outline allegations of sextortion, sexual assault and severe psychological harm.
Roblox, launched in 2006, allows users to create and explore player-generated games and interact through in-game chat features. While marketed as family-friendly, the company now faces more than 100 federal child sexual exploitation lawsuits consolidated before Judge Richard Seeborg in the Northern District of California.
Families bringing claims allege predators used inadequate safety systems to pose as peers, groom children and coerce explicit images or meetings. Some cases reference kidnapping, trafficking and long-term emotional trauma.
In one recently filed California state lawsuit, a mother alleges her nine-year-old daughter was groomed by an adult posing as a child through Roblox’s chat functions. The complaint states:
“Had Defendant implemented even the most basic system of screening or age and identity verification, as well as effective parental controls, Plaintiff never would have interacted with these predators and never would have suffered the harm that she did. Plaintiff’s life has been devastated as a direct result of Defendant’s conduct.”
Roblox began rolling out facial age verification technology in September 2025. However, families involved in litigation argue that the company implemented the safeguards too late to prevent harm to earlier victims.
Discord’s new age assurance system will require users to verify their age once. The verified age group will determine access to sensitive content, default safety settings and age-restricted channels.
Users may encounter verification prompts when attempting to view sensitive content, change default safety settings or access age-restricted channels. Verification methods include a facial age estimate via video selfie or an identity document check.
Discord states that most verification data, including video selfies and ID documents, is processed on the user’s device or deleted shortly after review. Verified age status will appear only within a user’s account settings and will not be publicly visible.
The initial rollout applies to UK and Australian users due to legal requirements. However, Discord has confirmed global expansion beginning March 2026. Users in other regions may see testing prompts as part of phased implementation.
Child protection experts continue to warn that online grooming frequently spans multiple platforms. Perpetrators may use one app to initiate contact before moving conversations to private or encrypted channels elsewhere.
As legal proceedings against Roblox continue and platforms introduce new age verification technologies, safeguarding advocates stress that platforms must pair technological solutions with robust moderation, clear reporting systems and trauma-informed responses.
At Safeline, we support children, young people and adults affected by sexual abuse, exploitation and grooming — including online sexual abuse. If you or someone you know has been impacted, confidential specialist support is available.
Support is free, non-judgemental and survivor-led.
Stay up to date with Safeline’s work by following us on social media.