A New Mexico court has ordered Meta to pay $375 million for deceiving users about the safety of its children’s platforms.
A jury found that Meta, which owns Facebook, Instagram and WhatsApp, was responsible for the way its platforms endangered children and exposed them to sexually explicit material and contact with sexual predators, reports the Telegraph.
New Mexico Attorney General Raul Torrez said the decision is “historic” and marks the first time a state has successfully sued Meta over child safety issues.
A spokeswoman for Meta, led by chairman and CEO Mark Zuckerberg, said the company disagrees with the decision and intends to appeal.
“We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors and harmful content. We remain confident in our record of protecting teens online,” she said.
The jury found that Meta was liable for violating the New Mexico Unfair Practices Act because it misled the public about the safety of its platforms for young users.
During a seven-week trial, the jury was presented with internal Meta documents and heard testimony from former employees about how the company had been aware of child predators using its platforms.
Arturo Béjar, a former engineering lead at Meta who left the company in 2021 and became a whistleblower, testified about various experiments he conducted on Instagram that showed underage users were being offered sexualized content.
He said his young daughter was propositioned for sex by a stranger on Instagram.
State prosecutors pointed to internal Meta research that, at one point, found that 16% of all Instagram users had reported being exposed to unwanted nudity or sexual activity in a single week.
Meta argued that it has worked over the years to combat problematic users of its platforms and promote safe experiences for minors.
In 2024, Instagram launched Teen Accounts, giving younger users more ways to control their experience. Just last month, it launched a feature that would notify parents if their children were browsing self-harm content.
The total civil penalty of $375 million was reached after the jury ruled that there were thousands of violations of the act, each with a maximum fine of $5,000.
Meta is also involved in a separate trial in Los Angeles, in which a young woman claims she became addicted as a child to platforms like Instagram and Google-owned YouTube because of the way they are intentionally designed.
There are thousands of similar lawsuits going through American courts.
New Mexico sued Meta in 2023, alleging that the company "directed" young users to content that was sexually explicit or depicted child sexual abuse, and exposed them to adults seeking such material and to sex trafficking.
It said the company did this through its recommendation algorithms, which are essentially tools that Meta uses to automatically curate the content a user sees on its platforms.
“Meta’s executives knew their products were harming children, ignored warnings from their employees, and lied to the public about what they knew,” Torrez said.
“Today the jury joined families, educators and child safety experts in saying enough is enough,” he added.
