Leading figures in the tech industry, including Meta’s Mark Zuckerberg and X’s Linda Yaccarino, are set to testify in Washington today amid growing concerns about the impact of online activities on children’s mental health and safety.
Lawmakers argue that major tech companies are failing to safeguard children from sexual exploitation and have called for stricter laws, insisting that the executives explain what steps they have taken so far.
The heads of TikTok, Discord, and Snap are also scheduled to appear, marking a significant occasion as many, including Yaccarino, testify before Congress for the first time.
Yaccarino, Discord’s Jason Citron, and Snap’s Evan Spiegel received subpoenas, legal orders compelling their appearance at the Senate Judiciary Committee hearing. In contrast, Zuckerberg and TikTok CEO Shou Zi Chew volunteered to testify.
Senators Dick Durbin and Lindsey Graham emphasized the demand for action from parents and children when announcing the hearing plans.
The hearing comes on the heels of a former senior staff member at Meta expressing concerns to Congress about Instagram’s inadequate measures to protect teens from sexual harassment. Meta responded by highlighting the implementation of “over 30 tools” aimed at fostering a secure online environment for teens.
Online harms, particularly explicit images of children being shared online, have raised alarms within the Senate Judiciary Committee. Reports of fake images created using artificial intelligence have intensified the concerns. Lawmakers pointed to the increase in such images and cited whistleblower accounts and testimonies from child abuse survivors as additional factors prompting the hearing.
In February 2023, the Senate Judiciary Committee examined the same topic, with witnesses and lawmakers reaching a consensus that firms should be held accountable. Bills such as the Kids Online Safety Act (KOSA), recently endorsed by Snap, have since been introduced.
Big tech companies, facing lawsuits over their handling of child and teen accounts, assert their commitment to addressing the issue. Microsoft and Google have developed tools to help platforms identify and report concerning content to the National Center for Missing and Exploited Children in the US.
Social media platforms themselves have implemented changes to enhance child safety, including parental controls, which manage access and provide insights into children’s online activity. Tools reminding children to limit their platform usage have also been introduced.
Published on: January 31, 2024 15:23:40 IST