New Mexico accuses Meta of putting profits over child safety as trial opens


(CN) — The state of New Mexico opened a projected 8-week trial on Monday, claiming Meta made false statements about its efforts to improve child safety and remove bad actors on its main social media platforms Instagram and Facebook.

State attorney Donald Migliori told a panel of 12 jurors and six alternates in a Santa Fe County courtroom that Meta’s algorithms intentionally promote sexually exploitative or otherwise harmful content to minors and prolong the time teens spend on the apps to maximize advertising revenue.

“The theme throughout this trial is going to be that Meta puts profits over safety,” the Motley Rice partner said Monday morning.

Migliori told the jury Meta knowingly overstates the efficacy of its teen safety features. Meta has claimed in public hearings that children are safe from adults sending them explicit material, safe from sexual exploitation or trafficking, and safe from other content that promotes self-harm, suicide and eating disorders, Migliori said. Internal Meta documents tell a different story.

“Internally, Meta clearly knew that youth safety is not a corporate priority,” he said. “Safety measures were under-resourced, ineffective and deprioritized. Meta has acted with total disregard, making its conduct unconscionable.”

Migliori presented the jury with a handful of statements made by Meta and its CEO, Mark Zuckerberg, comparing them to Meta’s internal documents.

In 2018, Zuckerberg told Congress that Meta prioritizes child safety above everything else. In an internal memo titled “Safety First,” Zuckerberg said he can’t prioritize safety over free expression.

“Keeping people safe is the counterbalance, not the main point,” Zuckerberg wrote in the document shared with the jury.

Zuckerberg also told Congress that Meta doesn’t allow children younger than 13 on its platforms, but internal data indicates that more than 4 million Instagram users are younger than 13.

A 2018 document presented by Migliori indicated that Meta was aware teen accounts experienced a higher rate of exposure to content that violates Instagram’s safety guidelines than adult accounts did. A 2020 memo reported that inappropriate interactions with children were “through the roof,” with one employee writing that Meta had not been fulfilling its legal obligation to report child-exploitative images to the National Center for Missing and Exploited Children.

Migliori added that while Meta does report to NCMEC, it still takes insufficient enforcement action on its platforms.

Representing the tech conglomerate, Kellogg Hansen attorney Kevin Huff said Meta has always been transparent about its efforts to protect children and the reality that those efforts can never be 100% effective.

“Meta tells the world what the policies are so everyone knows the rules,” Huff told the jury. “Meta publicly admits that some of this content gets past its safeguards.”

He said Meta warns all users on its platforms that they may be inadvertently exposed to sexually exploitative or otherwise harmful material.

“That’s disclosure, not deception,” he said.

Among its safety features, Meta touts its teen account setting for children 13 to 17, which can limit what content users can see, whether strangers can interact with them and how much time they spend on the app. Huff said Meta implemented those protections knowing it would cut teen traffic by 3.3%.

Huff said the state’s “profit over safety” argument falls flat because Meta has a financial incentive to keep the app safe. The more positive experiences, the more future engagement, he said.

Because Meta is aware that children younger than 13 can give a false age to gain access to the platforms, Huff said the company provides forms to report underage users and deploys automated tech that detects children who may be too young.

“Are these tools perfect?” Huff asked. “Of course not. But they are constantly improving. And Meta does not hide that fact from the public either.”

New Mexico Attorney General Raúl Torrez filed suit in 2023, accusing Meta of creating a marketplace and “breeding ground” for predators who target children for sexual exploitation, and of failing to disclose what it knew about those harmful effects. Meta says the state cherry-picked data out of billions of daily posts across its platforms to sensationalize a relatively small issue.

Zuckerberg was dismissed as a defendant in his personal capacity, but he has been deposed and may still testify at trial.

The lawsuit was built largely on an undercover investigation in which state actors used fake accounts posing as children to document sexual solicitation from other users and observe Meta’s response.

In his opening, Huff questioned the investigation’s ethics.

“The state designed the investigation to intentionally circumvent Meta’s safeguards to make Facebook and Instagram look more dangerous than they really are,” he said.

Investigators made accounts to resemble children but told the app they were adults to avoid the teen safety features. The investigators used photos of real underage girls to lure known pedophiles into engaging with the accounts on Instagram and Facebook.

“The state put real people at risk,” Huff said.

Meta also faces a multistate lawsuit in California on claims its products are designed to be addictive and harmful to children and teens. Snapchat and TikTok settled in January, but Meta and Google remain defendants.
