Elon Musk’s X fined $380K over “serious” child safety concerns, watchdog says

Today, X (formerly known as Twitter) became the first platform fined under Australia’s Online Safety Act. The fine comes after X failed to respond to more than a dozen key questions from Australian eSafety Commissioner Julie Inman Grant, who sought clarity on how effectively X detects and mitigates child exploitation and grooming on the platform.

In a press release, Inman Grant said that X was given 28 days to appeal the decision or pay the approximately $380,000 fine. While the fine seems small, the reputational ding could further hurt X’s chances of persuading advertisers to increase spending on the platform, Reuters suggested. And any failure to comply or respond could trigger even more fines—with X potentially on the hook for as much as $493,402 daily for alleged non-compliance dating back to March 2023, The Guardian reported. That could quickly add up to tens of millions if X misses the Australian regulator’s deadline.

“If they choose not to pay, it’s open to eSafety to take other action or to seek a civil penalty through the courts,” Inman Grant told the Sydney Morning Herald. “We’re talking about some of the most heinous crimes playing out on these platforms, committed against innocent children.”

While eSafety has reported that all the major tech companies—including Meta, Apple, Microsoft, Skype, Snap, Discord, TikTok, Twitch, X, and Google—have “serious shortfalls” when it comes to tackling child sexual abuse materials (CSAM) and grooming, X’s non-compliance “was found to be more serious.”

In some cases, X left responses “entirely blank,” Inman Grant reported, and in others, X provided inaccurate information. The report explained:

Twitter/X did not respond to a number of key questions including the time it takes the platform to respond to reports of child sexual exploitation; the measures it has in place to detect child sexual exploitation in livestreams; and the tools and technologies it uses to detect child sexual exploitation material. The company also failed to adequately answer questions relating to the number of safety and public policy staff still employed at Twitter/X following the October 2022 acquisition and subsequent job cuts.

X did not respond to Ars’ request for comment.

In February, when the Australian watchdog first issued then-Twitter a compliance notice, Twitter Safety boasted on the platform that Twitter was “moving faster than ever to make Twitter safer and keep child sexual exploitation (CSE) material off our platform.” Last month, that account—which is now called X Safety—posted that “there is no place in this world or on X for the abuse of children,” claiming that “over the past year we have strengthened our policies, deployed new automated technology, and increased the number of cybertips we send to” the National Center for Missing and Exploited Children.

That post also said that X has taken “action on five times as much content” as the platform did in 2022, noting that “95 percent of the accounts we suspend we find before any user reports,” which was “up from 75 percent.” Australia’s report clarified that before Musk acquired Twitter, the platform proactively detected 90 percent of CSAM; after mass layoffs, that rate fell to 75 percent, and X failed to specify to eSafety how much it has improved since then.
