On July 29, the family of a 13-year-old girl filed a lawsuit against Roblox. According to the suit, the child “was introduced to an adult predator on Roblox unbeknownst to her parents, groomed, and then kidnapped from her grandmother’s West Des Moines, Iowa home on May 24, 2025, where she was subsequently trafficked across multiple states, and repeatedly sexually abused and raped.”
This isn’t an isolated story. According to Bloomberg, which ran a long feature on the “Roblox Predator Problem” last year, U.S. police had “arrested at least two dozen people accused of abducting or abusing victims they’d met or groomed using Roblox.” And on August 14, Louisiana sued Roblox, arguing the company is “accountable for the facilitation and distribution of child sexual abuse material (‘CSAM’) and for the sexual exploitation of Louisiana’s minor children.”
Louisiana’s lawsuit comes at a time when lawmakers are grappling with how to make the internet safer for children. In May, Republican Sen. Marsha Blackburn of Tennessee reintroduced her 2022 Kids Online Safety Act—co-sponsored by Democratic Sen. Richard Blumenthal—which, if passed, would create a “duty of care” requirement for all online platforms, social media companies and video games alike. A total of 41 senators—24 Republicans and 17 Democrats—have signed on to cosponsor the legislation.
The same month, Sen. Mike Lee of Utah and Rep. John James introduced the App Store Accountability Act, which would implement age-verification requirements for app stores, require that users under 18 obtain parental consent to access an app store or download an app, and automatically link minors’ devices and app store activity to their parents’ or guardians’ accounts. Earlier, in February, Lee also introduced the SCREEN Act to require age verification for access to websites “that make available content that is harmful to minors,” such as pornography. The U.K. passed similar legislation in 2023 with the Online Safety Act, which requires platforms to “prevent children from accessing harmful and age-inappropriate content,” as well as properly label such content for adult users. The penalty for violations is steep: “Companies can be fined up to £18 million or 10 percent of their qualifying worldwide revenue, whichever is greater,” a U.K. government website on the law states. Since the law’s age-verification requirements took effect in July, VPN downloads have topped the U.K. Apple App Store chart, and some users have bypassed age checks by using the faces of video game characters.
Jessica Melugin, director of the Competitive Enterprise Institute’s Center for Technology and Innovation, said that while she believes these efforts are well-intentioned, she doubts they will prove effective and worries they could do more harm than good. “This is a brilliant strategy to chill America’s free speech and a sincere effort, probably, to protect children,” she told TMD. “It’s kind of ridiculous to think that sites can catch 100 percent of bad actors, like, it’s a platform,” Melugin explained. “It’s not in business to catch criminals.” She added that “more resources for law enforcement would, I think, do a lot more good than proposals like age verification for everyone in the whole world to access everything.”
At an August 14 news conference, Louisiana Attorney General Liz Murrill remarked, “We passed laws to keep sexual predators away from our kids, and daycares, and schools, and on playgrounds but there are companies who are facilitating and promoting virtual access that exposes your child, even when you think they’re safe, in your home or in their own room.” Her lawsuit accused Roblox of engaging in false and deceptive marketing by advertising its platform as safe for children; of negligence in failing to maintain a “reasonable standard of care” for its product; and of unjustly enriching itself, including through its in-game currency, “Robux,” which online users have used to extort explicit content from child users.
“We share the critically important goal of keeping kids safe online and any assertion otherwise is categorically untrue,” a Roblox spokesperson said in a statement shared with TMD. “Many games the lawsuit highlighted were long removed for violating our safety policies and we work constantly to remove violative content and bad actors. And we continue to innovate and add new safety features regularly in an effort to help keep kids safe. We share Attorney General Murrill’s urgency to help keep kids safe because safety has been our priority. We look forward to working with the Attorney General to help keep kids safe in Louisiana and across the country.”
Roblox made changes to improve child safety on its platform in November 2024, including giving parents an account to monitor and review their child’s gameplay, changing the default settings so that users under 13 cannot directly message other players (unless the user or the user’s parent removes that limit), and restricting in-game “experiences”—mini-games within Roblox, some of which are created by other users—so that users under 9 years old can access only areas rated “minimal” and “mild,” or, with parental consent, “moderate.” These changes came about one month after the investment firm Hindenburg Research released a report calling Roblox an “X-rated pedophile hellscape, exposing children to grooming, pornography, violent content and extremely abusive speech.”
But loopholes remain. De Ionno pointed to Roblox’s filtering of vulgar and sexually explicit language in the game’s built-in chat system. While particular words and phrases are automatically blocked, users have found easy ways around the filter, such as communicating through emojis or replacing letters with similar-looking symbols. As a result, De Ionno explained, it’s not uncommon to encounter sexually charged discussions, vulgar language, and racism in Roblox’s chat.
However, the most significant concern with this loophole is the potential for conversations between users to be moved off Roblox and onto third-party platforms. (For example, a user cannot ask for another player’s “Instagram” handle, but a request may go undetected if spelled “1nst@gram.”)
“That’s the bit that’s concerning,” De Ionno said, “where you’ve got adults potentially making contact with children and then going off onto platforms which are even less observed, and scrutinized, and moderated than Roblox, where basically anything can happen.” Not everything on Roblox is problematic, and there are “lots of things on Roblox that are child-friendly,” De Ionno explained, but “it’s … where people can meet each other and then go off-platform.”
Roblox CEO David Baszucki has maintained that the gaming platform remains a safe environment for children and called on parents to learn more about Roblox and make their own determination. “My first message would be, if you’re not comfortable, don’t let your kids be on Roblox,” he told the BBC in March. Melugin reiterated this point: Few people are better positioned to monitor a child’s safety than their parents. “I mean, I’m a parent, and if I’m worried about a platform being unsafe, I’m not going to let my kids spend time on it,” she told TMD. “That’s just it.”