Australia probes Roblox and Minecraft over child safety


Australia’s eSafety regulator has asked gaming companies, including Microsoft and Roblox, to explain how they are protecting children from sexual exploitation and radicalisation.

Companies including Roblox, Minecraft, Epic Games’ Fortnite and Valve’s Steam have been issued with legally enforceable transparency notices by the eSafety Commissioner’s office. The notices require details on safety systems, staffing and cybersecurity measures and were issued on Wednesday (22 April).


Julie Inman Grant, the eSafety Commissioner, said gaming platforms are a major social space for children, noting nine in 10 Australians aged eight to 17 have played online games. She added they also carry significant risks.


“What we often see after these offenders make contact with children in online game environments, they then move children to private messaging services,” Inman Grant said in a statement.


Encrypted messaging can become the first point of contact between children and offenders involved in grooming, sexual extortion and radicalisation.


“Predatory adults know this and target children through grooming or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact offending, radicalisation and other off-platform harms,” she added.


Companies must comply with the notices or face penalties and potential civil action.


Microsoft said it is reviewing the regulator’s notice and takes children’s online safety seriously. A spokesperson said via email: “We continue to evolve our approach to meet the evolving threat and regulatory landscape.”


Roblox did not immediately respond to requests for comment.


The move comes amid increased scrutiny of how gaming platforms detect threats to minors, with real-time chats between users harder to moderate than traditional social media.


Roblox lawsuits

Roblox, one of the companies targeted by Australian authorities, is embroiled in legal action. On Tuesday it reached settlements in the U.S. states of Alabama and West Virginia over allegations it failed to protect children, agreeing to pay more than $23 million and make changes to chat and gaming features.


The company is also facing more than 140 lawsuits in U.S. federal courts accusing it of knowingly facilitating child sexual exploitation.


As it grapples with these legal challenges, Roblox last week said it would introduce tailored accounts for younger users from June, assigning children aged five to eight to “Roblox Kids” and users aged nine to 15 to “Roblox Select.”


