Social Media CEOs Brace for Senate Spotlight on Child Safety: What to Expect


(Bloomberg) — Some of the tech industry’s most prominent and powerful leaders will descend on Capitol Hill Wednesday for a Senate hearing focused on protecting children online.


Chief executive officers from Meta Platforms Inc., X (formerly Twitter), Snap Inc., TikTok and Discord will provide testimony and take questions from members of the Senate Judiciary Committee, which has supported several bills related to kids’ digital safety. Congress has increasingly scrutinized social media platforms as growing evidence suggests that excessive use and the proliferation of harmful content may be damaging young people’s mental health.

Several bipartisan proposals seek to hold tech companies accountable, strengthen protections for young users and stop children’s sexual exploitation online. Yet a myriad of tech trade groups and civil liberties organizations have criticized many of the proposed measures as flawed and counterproductive, arguing they would worsen online privacy and safety if advanced. A handful of social media companies, including TikTok, which is owned by ByteDance Ltd., and Meta, are facing lawsuits in California that claim the companies were negligent and ignored the potential harms their platforms created for teens.

Questions from elected officials rarely stay hyper-focused on the topic at hand when tech CEOs visit Washington, especially during hearings related to online content moderation, so Wednesday's session may well end up being wide-ranging. Here's a look at who will be testifying from each company, and what they are likely to discuss.

Read more: Meta, X, TikTok Face Senators' Scrutiny Over Kids' Online Safety

Mark Zuckerberg — Meta

Meta, which owns the social networking apps Facebook and Instagram, has faced significant pushback over the years for its child safety practices. Investigations by whistleblowers, news organizations and academic researchers have found the company's sites can harm the mental health of young users and connect networks of predators to child-sex content. In October, more than 30 states sued Meta, alleging its social media apps were feeding harmful content to youth. Meta planned in 2021 to create a version of Instagram for kids under 13, but scrapped the project after pushback over Instagram's impact on teens' mental health.

CEO Mark Zuckerberg, who has testified extensively before Congress in the past but has recently steered away from policy issues, will focus on the company's efforts to improve child safety. Earlier this month the company announced plans to tighten default messaging settings for teens on Instagram and Facebook, and to restrict teens from seeing age-inappropriate content. Meta also has advertising policies that prohibit marketers from showing certain types of ads to teenagers or from targeting them based on factors like gender or their on-network activity.

Linda Yaccarino — X

Linda Yaccarino will make her first appearance before Congress as X’s CEO, a role she took on last June. The former advertising boss at NBCUniversal, Yaccarino has spent her first eight months on the job trying to win back advertisers and convince skeptics that X’s owner, Elon Musk, still cares about policing the social network. Musk has spoken or tweeted several times about the importance of protecting children online, making it a key part of the company’s public campaign to regain user trust and approval.

Yaccarino was in Washington this week to meet senators ahead of Wednesday’s hearing and talk about the company’s efforts to fight content related to child sexual exploitation. During meetings, she also stressed that X is a completely different company than its predecessor, Twitter, and it’s likely she’ll try to further distance X from its former self during the hearing by highlighting differences in content policies or strategy. X announced over the weekend that it will build a new Trust and Safety center in Austin with employees focused primarily on combatting content related to child sexual exploitation. It’s also likely Yaccarino will face questions about the rise of AI-generated content, including explicit content, and its dissemination on X. Last week, explicit images generated to look like Taylor Swift circulated on X for hours before they were removed, garnering millions of views and raising questions about the platform’s ability to quickly moderate abusive and illegal posts.

Evan Spiegel — Snap

CEO Evan Spiegel oversees Snapchat, an app popular with teens that centers on person-to-person messaging more than public posting. But that hasn't insulated it from criticism. Snap is facing a California lawsuit brought by families who allege their children died from overdoses after purchasing drugs through the app.

In 2022, Snap introduced a function that lets parents or caretakers see certain activity on their kid’s account and implement controls, for example over whether their child is allowed to engage with the company’s AI chatbot. Last year, the app also imposed a strike system for accounts that post content publicly to Stories or Spotlight that’s inappropriate for teens.

Ahead of Wednesday’s hearing, Snap was the first tech company to endorse the Kids Online Safety Act, opposing industry trade group NetChoice’s stance on the bill.

Shou Chew — TikTok

CEO Shou Chew returns to Congress almost a year after his first solo testimony before the House. Then, he was questioned on child safety concerns around TikTok's addictive nature and on content promoting eating disorders, drug sales and sexual exploitation. Chew, who faced a confrontational audience at that hearing, argued that these issues are not unique to TikTok.

Last year, the company introduced a preset time limit of one hour for users under 18, after which the app prompts for a passcode to continue watching. Users who say they are 13 to 15 years old have private accounts by default and cannot send messages. Like other apps, TikTok has a dashboard that can share usage information with parents and caregivers.

Chew could also get questions on the company’s relationship with China via parent company ByteDance. Senators may also ask about recent hot-button issues unrelated to children, including perceived biases when it comes to conflicts like the Israel-Hamas war, as well as the proliferation of AI-generated videos.

Jason Citron — Discord

Originally a chat app for gamers, Discord has been implicated in several high-profile investigations involving child predation, extremism and even terrorism. Today, Discord is mainstream among millennials and Gen Z for day-to-day communication with friends and online acquaintances. In 2021, the company reported 150 million monthly active users and explored a roughly $12 billion acquisition by Microsoft.

That increased popularity has brought higher levels of abuse. Between 2021 and 2022, instances of child sexual exploitation on the platform increased nearly sixfold to 169,800, according to data from the National Center for Missing and Exploited Children. That's 73% higher than X's figure, although the increase is also due in part to better detection methods.

CEO and co-founder Jason Citron will represent the company before the committee and discuss initiatives to protect children on the platform. That will likely include its new, open-source model for detecting novel child abuse material and its work with the cross-platform child safety organization Lantern. Late in 2023, Discord rolled out new features for teens and families to better control their online experience.

–With assistance from Aisha Counts and Oma Seddiq.


©2024 Bloomberg L.P.

