Content warning: This article discusses self-harm, child sexual abuse, and exploitation.
The Canadian federal government is reportedly concerned about the extent to which violent extremists, white nationalists, and pedophiles are using Roblox and other online platforms to target and recruit children.
The report comes from the publication The Logic, which obtained a December 2025 Public Safety Canada (PSC) brief through an access to information request. The brief was prepared for PSC by the Canada Centre, which leads the government’s anti-radicalization work.
It’s the latest in a years-long string of escalating investigations into allegations of mass child abuse on Roblox. It’s notable, however, that the Canadian government is also expressing concern over the potential for extremist recruitment to causes like white nationalism.
The PSC report, according to The Logic, explains that Roblox’s young user base, social media-like features, and abundance of user-generated content “create unique vulnerabilities and risks” for its younger demographic.
“As such, Roblox may impact youth radicalization in unexpected ways,” the report concluded.
Inadequate content moderation was another issue raised in the report, The Logic says, which claims it has made the platform a haven for extremist and violence-promoting communities. The report adds that there’s an “evident risk” that children using Roblox can be lured to other platforms, including Discord and Snapchat, where they are also vulnerable to abuse. The report also found that extremist and violent groups were using Roblox to “identify and victimize new targets for child exploitation as part of their strategy.”
“Safety is at the core of everything we do at Roblox, and we have zero tolerance for child endangerment or extremism on our platform,” Roblox chief safety officer Matt Kaufman told The Logic. “Roblox has a multi-layered safety system that includes advanced AI-powered detection, monitoring teams, 24/7 moderation, and robust user reporting tools.”
Criticism and litigation around child safety in Roblox continue to increase
Earlier this month, Roblox parent company Roblox Corp. rolled out age-based accounts for children and teens, prohibiting access to certain experiences and communication features.
For the last couple of years, the company has been widely criticized and litigated against for allegedly failing to protect children from predators and other bad actors. The most recent lawsuit, filed in February by LA County, targets alleged “business practices that endanger and exploit children.” Also in February, Australian communications minister Anika Wells questioned Roblox’s PG rating over child grooming concerns.
In November, Roblox Corp. CEO David Baszucki clashed with reporters when pressed about child safety. Previously, Baszucki said during an interview, “If you’re not comfortable, don’t let your kids be on Roblox.”
The warning reported by The Logic comes as the federal government considers banning children under the age of 16 from social media. While that ban, if implemented, is expected to target major social media platforms such as Instagram and TikTok, popular games aimed at children that include social media-like features are “a bit of a different animal,” Culture Minister Marc Miller told The Logic.
“The gaming industry is different than other platforms, and the more that they become sort of social media-ish, the more they expose themselves to responsibility and potentially regulation,” he said Wednesday. Miller is expected to table legislation aimed at improving online safety for children later this year.