Zuckerberg among tech bosses to testify on child safety


  • By Tom Gerken & Liv McMahon
  • Technology reporters



Tech bosses including Meta’s Mark Zuckerberg and Linda Yaccarino of X are due to testify in Washington today, as concerns rise about children’s mental health and safety online.

Politicians say big tech companies are not doing enough to protect children from sexual exploitation.

They have been debating tougher laws and demanded that the executives appear to explain what has been done so far.

The heads of TikTok, Discord and Snap are also due to attend.

It marks the first time many of the executives, including Ms Yaccarino, have testified before Congress.

Ms Yaccarino, as well as Discord boss Jason Citron and Snap chief Evan Spiegel, received subpoenas – legal orders – before agreeing to appear at the Senate Judiciary Committee hearing. Mr Zuckerberg and TikTok chief executive Shou Zi Chew voluntarily agreed to testify.

“Parents and kids demand action,” Senators Dick Durbin and Lindsey Graham said when they announced the plans.

At the time, Meta said it had brought in “over 30 tools” to support a safe environment for teens online.

Online harms

The Senate Judiciary Committee held a hearing in February 2023 on the same topic, in which witnesses and lawmakers agreed firms should be held to account.

Legislators have since brought forward bills such as the Kids Online Safety Act (KOSA) – which was recently backed by Snapchat.

The committee is believed to be particularly concerned about reports of explicit images of children being shared online, including fake images created using artificial intelligence.

US lawmakers said there had been a rise in such images, and cited evidence given by whistleblowers and testimony from child abuse survivors as further reasons for the hearing.

Big tech companies, some of which are also facing lawsuits over their approach to child and teen accounts, have said they are working to address the issue.

Microsoft and Google have developed tools to help platforms identify and report such content to the National Center for Missing and Exploited Children in the US.

And the social media platforms themselves have made several changes to increase child safety online.

For example, many have implemented parental controls that can restrict access or show parents how much time their children are spending on social media, while others have tools that remind children to take a break after a set period of use.

Other systems used by firms to protect children online include hiding harmful content – such as posts about self-harm – from social media feeds, and restricting adults from sending direct messages to children.

None of this has stopped the clamour from politicians and the public for big tech firms to face further scrutiny, however – a fact some of the biggest names in technology are about to be reminded of.
