Australia’s online safety regulator on Wednesday ordered major online gaming platforms including Roblox, Minecraft, Fortnite and Steam to explain how they are protecting children from sexual predators and radicalization online.
The eSafety Commission said it had issued legally enforceable transparency notices seeking details on safety systems, staffing and moderation practices. Companies that fail to comply could face penalties and civil action.
Most Australian youngsters play online games
eSafety Commissioner Julie Inman Grant said online games had become social hubs for young people, with nine in 10 Australians aged eight to 17 playing them.
She warned that predators use gaming platforms to “make contact with children in online game environments, they then move children to private messaging services.”
Inman Grant said that "predatory adults" target children through "grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalization and other off-platform harms."
The move comes as Australia steps up efforts to curb online harms to minors after banning under-16s from major social media platforms last year.
However, the online safety watchdog found a "substantial proportion of Australian children" were still using the banned platforms three months after the ban took effect.
Roblox accused of failing to protect children
Roblox is facing more than 140 US lawsuits alleging it failed to stop the sexual exploitation of children.
On Tuesday, Roblox agreed to pay more than $23 million to settle cases brought by the US states of Alabama and West Virginia. A week earlier, the company announced tailored accounts for young users.
Edited by: Louis Oelofse