Meta faces another lawsuit over child safety

New Mexico is the latest jurisdiction to accuse Meta of failing to protect younger users. The state attorney general’s office filed suit against the company this week after investigators set up test accounts on Facebook and Instagram in which they claimed to be preteens or teenagers. They used AI-generated profile photos for the accounts. The AG’s office asserts that the accounts were barraged with explicit messages and images, along with sexual propositions from users. It also claims that Meta’s algorithms recommended sexual content to the test accounts.

The suit claims that “Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey,” according to The Wall Street Journal. In addition, it asserts that Meta failed to employ measures to stop users under 13 from using its platforms and that CEO Mark Zuckerberg is personally liable for product choices that increased risks to children.

To get around Meta’s age restrictions, investigators provided the company with adult dates of birth while setting up phony accounts for four children (kids often lie about their age to access online services they’re not supposed to use). However, they implied that the accounts were being used by children: one posted about losing a baby tooth and starting seventh grade. Per the suit, investigators also set up one account to make it seem as though the fictional child’s mother was possibly trafficking her.

The suit alleges that, among other things, the accounts were sent child sex images and offers to pay for sex. Two days after investigators set up an account for a phony 13-year-old girl, Meta’s algorithms suggested it follow a Facebook account with upwards of 119,000 followers that posted adult porn.

Investigators flagged inappropriate material (including some images that appeared to show nude, underage girls) through Meta’s reporting systems. According to the suit, Meta’s systems often found these images to be permissible on its platforms.

In a statement to the Journal, Meta claimed it prioritizes child safety and invests heavily in safety teams. “We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” the company said. Meta has also said it works to stop malicious adults from contacting children on its platforms.

Earlier this year, Meta set up a task force to tackle child safety issues after reports indicated that Instagram’s algorithms helped accounts that commissioned and bought underage-sex material find each other. Just last week, the Journal reported on the alleged prevalence of child exploitation material on Instagram and Facebook. According to the Canadian Centre for Child Protection, a “network of Instagram accounts with as many as 10 million followers each has continued to livestream videos of child sex abuse months after it was reported to the company.” Meta says it has taken action over such issues.

The New Mexico lawsuit follows suits that a group of 41 states and the District of Columbia filed against the company. Among other matters, they alleged that Meta knew its “addictive” aspects were harmful to young users and that it misled people about safety on its platforms.
