The chief executives of TikTok, Meta, X, Snap, and Discord went through the wringer on Wednesday at a Senate Judiciary Committee hearing focused on children’s safety online.
Most had to be dragged there, according to Senate Judiciary Chairman Dick Durbin (D-Ill.), who said Meta CEO Mark Zuckerberg and TikTok CEO Shou Zi Chew were the only executives who showed up voluntarily. Durbin said he was “disappointed that our other witnesses did not have the same degree of cooperation,” referring to Discord CEO Jason Citron, X CEO Linda Yaccarino, and Snap CEO Evan Spiegel. Citron responded to his subpoena only after U.S. marshals were sent to Discord’s headquarters “at taxpayer expense,” Durbin said.
The hearing opened with emotional video testimony from parents of children who had either been sexually abused online or had died by suicide after time spent on the platforms exacerbated their mental health problems. From there, the questioning grew increasingly intense as lawmakers pressed the CEOs on what they deemed inadequate safety measures on their platforms.
“They are responsible for many of the dangers our children face online. Their design choices, their failure to adequately invest in trust and safety, the constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk,” Durbin said, adding that Discord, Snapchat, TikTok, and Meta had all failed to protect children. He called TikTok the “platform of choice for predators” and criticized Discord and Meta’s Instagram as tools for pedophiles to connect with victims and each other.
‘These companies must be reined in’
The grilling of tech CEOs was a rare moment of bipartisanship in Congress. Sen. Lindsey Graham (R-S.C.) called it a “ray of hope” that Washington wasn’t entirely divided by party lines. He, too, said the power of social media companies needed to be contained.
“Social media companies, as they’re currently designed and operated, are dangerous products,” Graham said. “They’re destroying lives, threatening democracy itself. These companies must be reined in, or the worst is yet to come.”
The hearing comes as congressional leaders attempt to move forward with a raft of bipartisan legislation that would strengthen user protections against sexually explicit and drug-related content and impose stricter punishments on platforms that violate the proposed laws. Notably, one of these bills seeks to do away with portions of Section 230, which, with few exceptions, shields online companies from civil liability for content posted on their sites.
Sen. Amy Klobuchar (D-Minn.) was blunt about why the bills hadn’t passed: vigorous lobbying by the tech companies. “The reason they haven’t passed is because of the power of your companies,” Klobuchar said. “So let’s be really, really clear about that.”
An analysis by the nonprofit Issue One, which advocates against money in politics, found that the five companies spent a combined $30 million on lobbying last year and hired one lobbyist for every four members of Congress.
The prevailing sentiment among the committee’s senators was that the social media companies had abused the protections afforded to them by Section 230. “Of all the people in America we could give blanket liability protection to, this would be the last group I would pick,” Graham said.
Reactions from the CEOs to the various pieces of legislation were mixed. Zuckerberg said he didn’t support the bills themselves, even though he favored addressing the problems they targeted. Instead, he sought to shift responsibility to app stores, arguing they should be the ones to enforce parental controls. Graham grilled Citron on whether he supported any of the five bills; when Citron attempted to respond, Graham demanded a simple yes or no. Citron instead gave longer answers before Graham cut him off.
X: Less than 1% of users are kids
There were a few exceptions, however. Earlier this week, Spiegel and Snap became the first to support the Kids Online Safety Act, which the social media trade association NetChoice opposes. During the hearing, Spiegel also said he supports the Cooper Davis Act, which would require social media platforms to report online drug sales to law enforcement. Yaccarino told the committee that X favored the Stop CSAM bill, making it the first social media company to do so publicly. In her opening statement, she also sought to minimize the number of teen users on X.
“X is not the platform of choice for children and teens. We do not have a line of business dedicated to children,” Yaccarino said. “Children under the age of 13 are not allowed to open an account. Less than 1% of the U.S. users on X are between the ages of 13 and 17.”
X pledged to hire 100 full-time content moderators at a new “Trust and Safety center of excellence” in Austin to help enforce its content and safety rules, Bloomberg reported last week, just ahead of Wednesday’s Senate hearing. Meta, meanwhile, released its own legislative proposal that kept some of the bills’ provisions intact but featured a watered-down version of their enforcement mechanisms.
This timing was not lost on Durbin.
“Coincidentally, several of these companies implemented common-sense child safety improvements within the last week, days before their CEOs would have to justify their lack of action before this committee,” he said.
TikTok CEO Shou Zi Chew also took a jab, saying, “Let me give you a few examples of longstanding [safety] policies…we didn’t do them last week.” On TikTok, users under 16 have their profiles automatically set to private and can’t receive direct messages from anyone they don’t follow or already have a connection with. All minor users automatically get a one-hour daily screen-time limit on the app, which they must enter a passcode to bypass. Many of the other companies have similar features.
Many of the efforts at self-regulation were met with skepticism from lawmakers, who said they had thus far yielded few results. When Citron listed Discord’s “wide array of techniques” used to limit explicit images of minors, Durbin interrupted him to say, “Mr. Citron, if that were working we wouldn’t be here today.”
Committee members said that, because of Section 230, the companies testifying faced few consequences for the harm that the most exploitative and dangerous content on their sites inflicted on teenagers. Without legal recourse, Graham argued, victims couldn’t seek compensation and social media companies couldn’t be held accountable. “Nothing will change until the courtroom is open to victims of social media,” he said.