Info@NationalCyberSecurity

Meta Will Restrict Content For Teens Around Self-Harm, Eating Disorders Amid Child Safety Lawsuits


Topline

Meta said Tuesday it is adding protections to teen users’ accounts on Instagram and Facebook, hiding age-inappropriate content from search results and explore pages and prompting teens to update their privacy settings. The move comes amid lawsuits from states over child safety and ahead of an upcoming hearing before the Senate.

Key Facts

Meta will hide content about self-harm, suicide, and eating disorders from teen users on Instagram and Facebook, even when it is shared by an account they follow. The company said that while content on self-harm “can help destigmatize these issues,” “it’s a complex topic and isn’t necessarily suitable for all young people.”

Additionally, when teen users search for terms related to self-harm, suicide, and eating disorders, Meta will hide related results and redirect the search to a helpline or other resources for users to seek support.

Teen users on Instagram and Facebook will be “automatically” placed under the most restrictive content control setting, making it more difficult for them to encounter potentially age-inappropriate content or accounts.

Meta will also start prompting teen users to update their privacy settings. If teens choose the “recommended settings,” their accounts will restrict who can repost, tag, mention, or include their content in Reels Remixes, block messages from users who don’t follow them, and hide offensive comments.

While some of the restrictions are already applied to new teen users who sign up for Instagram and Facebook, Meta said the additional protections will roll out to all teen users in the coming weeks and months.

Key Background

In October, Meta was sued by 33 states for allegedly misleading the public over features that made Facebook and Instagram “addictive” to children, and was accused of trying to generate profit through building platforms that “entice, engage and ultimately ensnare youth and teens.” The lawsuit claimed Meta has “profoundly altered the psychological and social realities for a generation of young Americans.” In December, Meta announced it was working on technology to fight against online pedophiles who use its platforms to exploit children, amid reports that Facebook and Instagram were enabling child predators and pushing inappropriate content about children. The company also said it was creating a task force of specialists to look over its existing child safety policies to improve its system for flagging inappropriate content, groups, and pages. Meta also faced a lawsuit from New Mexico in December which claimed the company allowed its platforms “to become a marketplace for predators in search of children upon whom to prey.”

What To Watch For

Meta CEO Mark Zuckerberg will be joined by other high-profile tech executives including X CEO Linda Yaccarino and TikTok CEO Shou Zi Chew in a hearing before the Senate Judiciary Committee on Jan. 31. The executives will testify about how the platforms they lead have failed “to protect children online,” according to Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.).

Further Reading

Meta Sued By 33 States Over ‘Substantial Dangers’ For Kids On Instagram And Facebook (Forbes)

Meta Says It’s Stepping Up Fight Against Online Predators Amid Reports Of Child Exploitation (Forbes)

Meta Sued By New Mexico For Allegedly Creating ‘Marketplace For Predators’ (Forbes)

