Info@NationalCyberSecurity

Instagram, Facebook prioritised money over child safety, claims report


Meta failed to protect children from sexual exploitation on its platforms, Facebook and Instagram, according to an internal investigation reported by The Wall Street Journal.

Last year, Meta introduced new paid subscription tools, allowing influencers to monetize their content. However, these tools were misused by hundreds of adults to profit from content showing their own children in bikinis and leotards. The content appealed to an audience that was “overwhelmingly male” and demonstrated overt sexual interest in the children.

Meta’s internal reviews found that its algorithms actively promoted such accounts to users with known pedophilic interests. While the images did not constitute child pornography, investigators determined that some parents understood the content was catering to adults’ sexual gratification.

“Parents engaged in sexual banter about their own children or had their daughters interact with subscribers’ sexual messages,” the Journal reported.

The internal teams recommended requiring child-focused accounts to register so Meta could monitor them. However, Meta chose only to build a system preventing suspected pedophiles from subscribing, which often failed to work.

Meanwhile, Meta expanded the controversial subscription feature to more markets before implementing the planned safety features.

The Journal’s examination revealed ongoing enforcement failures. One banned parent-run account selling inappropriate content of a teen girl had returned to the platforms and gained hundreds of thousands of followers.

Meta frequently failed to remove backup Instagram and Facebook profiles used to promote banned content. Men in online forums reposted images of the child models and discussed obtaining more risqué content.

The Journal also provided examples of inappropriate content being monetized via Meta’s “gifts” program.

Meta frequently failed to detect exploitative videos that elicited cash gifts from followers. The company generally collects commissions on these gift payments.

By promoting these monetization features without regard for child safety, Meta allowed the sexual exploitation of minors on its platforms. Despite internal warnings, the company prioritized profit over the wellbeing of children.

Federal legislators and state attorneys general have noted Meta’s repeated child safety failures on its platforms. Last June, Meta formed a task force to address child sexualization, but related efforts have shown limited success.

Meta spokesman Andy Stone defended the company’s actions but said ongoing work is being done to improve safety. Critics argue Meta should have banned child modeling subscriptions altogether, as sites like Patreon and OnlyFans have.
