How concerned are you about the dangers of social media and the internet? In your opinion, are technology companies doing enough to protect children online? If not, should Congress step in and pass legislation to address the problem?
Last week, in a contentious congressional hearing, U.S. senators grilled executives from Meta, X, TikTok, Discord and Snap, telling the leaders they had “blood on your hands” and calling on them to apologize for their companies’ role in harming children.
Lawmakers on Wednesday denounced the chief executives of Meta, TikTok, X, Snap and Discord, accusing them of creating “a crisis in America” by willfully ignoring harmful content targeting children on their platforms, as concerns over the effect of technology on youths have mushroomed.
In a highly charged 3.5-hour hearing, members of the powerful Senate Judiciary Committee raised their voices and repeatedly castigated the five tech leaders — who run online services that are very popular with teenagers and younger children — for prioritizing profits over the well-being of youths. Some said the companies had “blood on their hands” and that users “would die waiting” for them to make changes to protect children. At one point, lawmakers compared the tech companies to cigarette makers.
“Every parent in America is terrified about the garbage that is directed at our kids,” Senator Ted Cruz, Republican of Texas, said.
The tech chiefs, some of whom showed up after being forced by subpoena, said they had invested billions to strengthen safety measures on their platforms. Some said they supported a bill that bolsters privacy and parental controls for children, while others pointed to the faults of rivals. All of the executives emphasized that they themselves were parents.
In one blistering exchange with Senator Josh Hawley, Republican of Missouri, Mark Zuckerberg, Meta’s chief executive, stood up and turned to address dozens of parents of online child sexual exploitation victims.
“I’m sorry for everything you have all been through,” Mr. Zuckerberg said. “No one should go through the things that your families have suffered.” He did not address whether Meta’s platforms had played a role in that suffering and said the company was investing in efforts to prevent such experiences.
The bipartisan hearing encapsulated the increasing alarm over tech’s impact on children and teenagers. Last year, Dr. Vivek Murthy, the U.S. surgeon general, identified social media as a cause of a youth mental health crisis. More than 105 million online images, videos and materials related to child sexual abuse were flagged in 2023 to the National Center for Missing and Exploited Children, the federally designated clearinghouse for the imagery. Parents have blamed the platforms for fueling cyberbullying and children’s suicides.
The issue has united Republicans and Democrats, with lawmakers pushing for a crackdown on how Silicon Valley companies treat their youngest and most vulnerable users.
What is your reaction to the Senate Judiciary Committee hearing on online child safety? How important is the issue to you? Is it a topic you regularly discuss with your friends, family or teachers?
How justified were the accusations senators leveled at tech executives? Do you think social media companies are prioritizing profits over the health and well-being of young people? Are tech leaders willfully ignoring harmful content that targets children on their platforms? Or have their efforts so far, such as investments in safety measures, proved that they are taking the issue seriously?
What is your reaction to the decision by Mark Zuckerberg, Meta’s chief executive, to stand and address abuse victims’ families in the hearing room and tell them, “I’m sorry for everything you have all been through”? Should all tech executives issue similar apologies? Is that enough? What should tech companies do to address the rising fears that their platforms are harming young people?
The Senate Judiciary Committee discussed several child safety bills directed at the tech platforms before the hearing. One bill, the Kids Online Safety Act, would create a legal duty for certain online platforms to protect children. Digital rights groups like the Electronic Frontier Foundation have criticized some of the legislative proposals, arguing that they could encourage platforms to take down legitimate content as companies try to comply with the laws. What do you think of such legislation? Which of the various proposed bills, if any, do you think would be most effective?
What other options or possible solutions exist, in your view, and why? What role should parents or children themselves play? If you were in charge of a social media company, what actions to protect young people, if any, would you take? If you were a member of Congress, what steps would you take?
What is missing in this debate? What do you think adults — parents, teachers, tech company executives, political leaders — should know about online safety and how young people use the internet that they might not understand?