An update on our work to tackle Child Sexual Exploitation on X


CEO Linda Yaccarino’s Opening Remarks before the Senate Judiciary Committee on January 31, 2024

Chairman Durbin, Ranking Member Graham, and Esteemed Members of the Committee: Thank you for the opportunity to discuss X’s work to protect the safety of minors online.

Today’s hearing names a crisis, and a crisis calls for immediate action. As a mother, this is personal, and I share your sense of urgency.

X is an entirely new company and an indispensable platform for the world and for democracy. You have my commitment that X will be part of the solution.

While I only joined X in June of 2023, I bring a history of working with governments, advocates, and NGOs to harness the power of the media to protect people. Before I joined, I was struck by the leadership steps this new company was taking to protect children.

X is not the platform of choice for children and teens, and X does not have a line of business dedicated to children. Children under 13 are not allowed to open accounts.

Less than one percent of U.S. users on X are between 13 and 17. Their accounts are automatically set to private by default, and they cannot receive a message from anyone they have not approved.

In the last 14 months, X has made material changes to protect minors. Our policy is clear: X has zero tolerance for any material that features or promotes child sexual exploitation.

My written testimony details X’s extensive policies on prohibited content and behavior, including grooming, blackmail, and identifying alleged victims of CSE.

We’ve also strengthened our enforcement with more tools and technology to prevent bad actors from distributing, searching for, or engaging with CSE content across all forms of media.

If CSE content is posted on X, we remove it. We now also remove any account that engages with CSE content, whether that content is real or computer-generated.

Last year, X suspended 12.4 million accounts for violating our CSE policies. This is up from the 2.3 million accounts removed by Twitter in 2022.

In 2023, X sent 850,000 reports to NCMEC, including our first ever fully automated report. That is eight times more than were sent prior to the acquisition.

We’ve restructured our Trust and Safety teams to remain strong and agile. We are building a Trust and Safety Center of Excellence in Austin, Texas, to bring more agents in house to accelerate our impact.

We’re applying to the Tech Coalition’s Project Lantern to make further industry-wide progress. We’ve also opened up our algorithms for increased transparency.

We want America to lead in this solution.

X commends the Senate for passing the REPORT Act, and we support the SHIELD Act: it is time for a federal standard to criminalize the sharing of non-consensual intimate images.

We need to raise the standards across the internet ecosystem, especially for those tech companies that are not here today and not stepping up.

X supports the STOP CSAM Act. The Kids Online Safety Act should continue to progress, and we will continue to engage with it to ensure it protects free speech.

There are two additional areas that require all of our attention. First, as the daughter of a police officer, I know that law enforcement must have the critical resources to bring these offenders to justice. Second, with artificial intelligence, offenders’ tactics will only grow more sophisticated, so industry collaboration is imperative.

X believes that freedom of speech and platform safety can and must coexist. We agree that now is the time for us all to act with urgency. Thank you.

————————————————

 

At X, we have zero tolerance for Child Sexual Exploitation (CSE), and we are determined to make X inhospitable for actors who seek to exploit minors. In 2023, we made clear that our top priority was tackling CSE online. 

As an entirely new company, X has strengthened its policies and enforcement to tackle CSE. We now take action not only against users who distribute this content, but also, immediately, against the networks of users who engage with it.

While X is not the platform of choice for children and teens (users between 13 and 17 account for less than 1% of our U.S. daily users), we have made it more difficult for bad actors to share or engage with CSE material on X, while making it simpler for our users to report CSE content.

In 2024, we will continue to share detailed updates about our investment in this area. We are improving our detection mechanisms so that we can find and report more violating content to the National Center for Missing and Exploited Children (NCMEC). We are also building a Trust and Safety Center of Excellence in Austin, Texas, to hire more in-house agents so we can keep accelerating our impact.

Here is a comprehensive update on our progress and our continued investment in this critical area.

CSE Actions

In 2023, as a result of our investment in additional tools and technology to combat CSE, X suspended 12.4 million accounts for violating our CSE policies. This is up from 2.3 million accounts in 2022.

Along with taking action under our rules, we also work closely with NCMEC. In 2023, X sent 850,000 reports to NCMEC, including our first ever fully-automated report, over eight times more than Twitter sent in 2022.

Not only are we detecting more bad actors faster; we’re also building new defenses that proactively reduce the discoverability of posts containing this type of content. One recently implemented measure has reduced the number of successful searches for known Child Sexual Abuse Material (CSAM) patterns by over 99% since December 2022.
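To illustrate the shape of such a measure, here is a minimal sketch of a search-side intervention in which queries matching a curated blocklist return no results at all. The term list, normalization rules, and function names below are illustrative assumptions, not X’s actual implementation:

```python
# Sketch of a search-side keyword intervention: queries matching a
# curated blocklist return no results at all. Placeholder terms only.

import unicodedata

# In production this would be the large, continuously curated list the
# update describes (2,500+ CSE terms and phrases); placeholders here.
BLOCKED_TERMS = {
    "blocked term one",
    "blocked term two",
}


def normalize(query: str) -> str:
    """Case-fold and compatibility-normalize so trivial variations of a
    blocked phrase (case, width, extra spaces) still match."""
    folded = unicodedata.normalize("NFKC", query).casefold()
    return " ".join(folded.split())


def is_blocked(query: str) -> bool:
    """True if the normalized query contains any blocked phrase."""
    q = normalize(query)
    return any(term in q for term in BLOCKED_TERMS)


def handle_search(query: str) -> list:
    if is_blocked(query):
        return []  # serve an empty results page rather than any content
    return backend_search(query)


def backend_search(query: str) -> list:
    # Stand-in for the real search index; out of scope for this sketch.
    return ["result for: " + query]


if __name__ == "__main__":
    print(handle_search("Blocked  Term  ONE"))  # [] -- intervention fires
    print(handle_search("harmless query"))      # normal results
```

Intervening at query time means a known-bad search surfaces nothing to engage with, regardless of whether any matching content slipped past upload-time scanning.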

Advanced technology and proactive monitoring

We’re investing in products and people to bolster our ability to detect and action more content and accounts, and are actively evaluating advanced technologies from third-party developers that can enhance our capabilities. Some highlights include:

  • Automated NCMEC reporting: In February 2023, we sent our first ever fully automated NCMEC CyberTipline report. Historically, every NCMEC report was manually reviewed and created by an agent. Through our media hash matching with Thorn, we now automatically suspend, deactivate, and report to NCMEC within minutes, without human involvement; a sketch of this kind of pipeline follows this list. This has allowed us to submit over 50,000 automated NCMEC reports in the past year.
  • Expanded Hash Matching to Videos and GIFs: For the first time ever, we are evaluating all videos and GIFs posted on X for CSAM. Since launching this new approach in July 2023, we have matched over 70,000 pieces of media.
  • Launched Search Intervention for CSE Keywords: CSAM impressions occur more on search than on any other product surface. In December 2022, we launched the ability to entirely block search results for certain terms. We have since added more than 2,500 CSE keywords and phrases to this list to prevent users from searching for common CSE terms.
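As referenced in the first item above, here is a minimal sketch of what a hash-matching pipeline of this shape can look like. A plain cryptographic digest stands in for the perceptual hashes a matching service such as Thorn’s Safer would supply, and every function name, account action, and report step below is an illustrative assumption rather than X’s actual system:

```python
# Sketch of a hash-match -> suspend -> report pipeline. SHA-256 is a
# stand-in for the robust perceptual hashing real systems use; all
# names and payloads here are illustrative assumptions.

import hashlib
from dataclasses import dataclass

# Hash set of known violative media, supplied by a matching service.
KNOWN_HASHES: set[str] = set()


@dataclass
class Upload:
    account_id: str
    media_bytes: bytes


def media_hash(media: bytes) -> str:
    """Cryptographic digest as a stand-in; real systems use perceptual
    hashes so re-encoded or lightly edited copies still match."""
    return hashlib.sha256(media).hexdigest()


def process_upload(upload: Upload) -> None:
    """On a match, suspend and report with no human in the loop."""
    if media_hash(upload.media_bytes) in KNOWN_HASHES:
        suspend_account(upload.account_id)
        file_cybertip_report(upload)


def suspend_account(account_id: str) -> None:
    print(f"suspended account {account_id}")  # stand-in enforcement action


def file_cybertip_report(upload: Upload) -> None:
    # Stand-in for submitting an automated NCMEC CyberTipline report.
    print(f"filed automated report for account {upload.account_id}")


if __name__ == "__main__":
    # Demo: seed the list with one known hash, then process a matching upload.
    sample = b"known violative media bytes (placeholder)"
    KNOWN_HASHES.add(media_hash(sample))
    process_upload(Upload(account_id="12345", media_bytes=sample))
```

Because matching against a list of known hashes is deterministic and high-precision, it is the kind of enforcement path that can run end to end without a human reviewer, which is what makes fully automated CyberTipline reporting feasible.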

Stronger partnerships, global cooperation

We are constantly seeking feedback and input from trusted organizations that are aligned in the mission to combat online CSE.

Foundational to our work is our multidimensional partnership with NCMEC, which manages the CyberTipline program, regularly convenes global stakeholders and facilitates actionable feedback from law enforcement that makes us better. Other instrumental partners are the Tech Coalition and WeProtect, alliances that push our innovation and provide critical information sharing on emerging threats and behaviors.

In December 2022, we launched a new product partnership that allows us to take down more violative media than before. Built by Thorn, Safer allows tech platforms to identify, remove, and report child sexual abuse material at scale.

The Internet Watch Foundation, where we’ve been a member since 2014, provides technical signals, trend and threat analysis, and member collaboration opportunities that directly impact our ability to remediate CSAM. International hotline operators like Point de Contact flag alleged illicit content on the basis of reports and their own monitoring. We are an active participant in the Child Protection Lab of the Paris Peace Forum, answering the international call to stand up for children’s rights in the digital environment.

We’re also evaluating participation in new multi-stakeholder programs and initiatives that can supplement our systems.

Ongoing improvements and regular review

At X, we are more vigilant and aggressive than ever in our enforcement. Our team regularly reviews and implements improvements to the measures we take to combat online child sexual exploitation to ensure their ongoing efficacy and performance. Our increased investment in this area throughout the year has yielded significant, measurable results.

Since April 2023, we’ve increased training for content moderators on the tools and policies for NCMEC reporting. This has led to a tenfold increase in the volume of manually submitted NCMEC reports, from an average of 6,300 reports per month to an average of 64,000 reports per month from June through November 2023. We are evaluating more sources of potential CSAM than we could before.

Working with law enforcement

Ultimately, it is critical that the bad actors be brought to justice and that law enforcement has the tools and information they need to prosecute these heinous crimes. X cooperates with law enforcement around the world and provides an online portal to submit removal requests, information requests, preservation requests, and emergency requests. Through this channel, law enforcement authorities can submit legal demands for law enforcement investigative purposes or for removal of alleged illicit content. Ongoing dialogue and knowledge sharing with law enforcement is key to achieving our mission.

Looking forward

Stopping child sexual exploitation will remain our top priority. Until it is brought to an end, our work will never stop.

In 2024, we will continue our strong investment in this critical area and expand our efforts to educate users about how they can help combat child sexual exploitation online. We are committed to making X a place where neither freedom of expression nor user safety is compromised in the public conversation.

Remember, we all have a role to play in protecting children and preventing child sexual exploitation. By working together, we can create a safer world for our children to grow and thrive.

  1. Report suspicious behavior: If you come across any content that you suspect may be related to child sexual exploitation, please report it to X, your local law enforcement, and the National Center for Missing and Exploited Children (NCMEC).
  2. Educate yourself and others: Learn more about the signs of child sexual exploitation and how to prevent it. Share this information with your friends, family, and colleagues to raise awareness and help protect children.
  3. Support organizations fighting child sexual exploitation: Consider donating your time or resources to organizations that work to prevent child sexual exploitation and support survivors.
  4. Be vigilant on social media: Be cautious about the information you share online and monitor your children’s online activity to ensure they are not being exposed to inappropriate content or being targeted by predators.

X Resources:

  • Read our Child Sexual Exploitation Policy
  • Report any content that you think may violate this policy using our Child Sexual Exploitation form
  • In addition to reporting Child Sexual Abuse Material (CSAM), you can also report any issue related to Child Safety, including Child Sexual Exploitation, grooming, and Physical Child Abuse (of a minor), in the X app. Find detailed instructions on how to report on our Help Center.

