A set of bills that City Council members are touting as a way to hold social media companies accountable for harming children’s mental health is facing criticism from some attorneys, who say the legislation would open the city up to lawsuits for civil rights violations.
The bills, introduced by Council Members Althea Stevens (D-Bronx) and Nantasha Williams (D-Queens), would limit social media usage to one hour a day for those under 17 years old and would require reports on the online activity of people under 24 years old who attend Department of Youth and Community Development programming and get into in-person altercations.
“We are here today to finally hold the billionaire social media CEOs accountable for … pushing content that forces children to give them their attention,” said Council Member Shekar Krishnan (D-Queens) at a rally outside City Hall ahead of a Tuesday hearing on the bills. “Increasingly, that content is violent news, dangerous trends and fueling feelings of isolation and anxiety.”
He was joined by City Council Speaker Julie Menin and Council Members Tiffany Caban (D-Queens) and Stevens, who cited studies showing social media has harmed children’s mental health and said students have asked them for protections from those harms.
However, Justin Harrison, senior policy counsel at the New York Civil Liberties Union, said he believed the bill would expose the city to lawsuits from social media companies because it would violate both their and children’s First Amendment rights.
“The bill has significant constitutional problems,” Harrison said. “Teenagers have First Amendment rights, and that includes a right to browse the internet, a right to consume information, a right to find friends, find community and find networks online. This sort of blanket, one-size-fits-all approach that the city appears to be taking, limiting access to all the same content for all minors, is really too broad a measure.”
Harrison added that requiring people to prove their age to a company online raises significant privacy and surveillance issues that courts are still debating.
He also raised concerns that the monitoring required to report on the social media activity of people under 24 who get into altercations would be unconstitutional. Council Member Caban said she shared those concerns during Tuesday’s hearing, and some advocates who testified before the council said they worried the reporting requirement could lead to increased policing of youth.
Harrison and New York Law School professor Michael Goodyear also raised questions about the practicality of enforcing the laws. Both said children could theoretically trick social media companies by using a VPN to appear to be in a different geographic location, or by uploading a photo of fake identification to pass as adults, making it difficult for the companies to stay in compliance and avoid penalties for violations that wouldn’t be entirely their fault.
Goodyear also said that social media companies facing different restrictions in different states and cities could make it hard for them to operate effectively in what he referred to as a “patchwork system” of enforcement.
“We also have to think about how this interplays with the state laws … as well as the patchwork of laws that are emerging across the country,” Goodyear said. “That does make it increasingly difficult for social media companies to fulfill their job and provide social media products that are effective for consumers, if they have to comply with 50 different state laws and know how many different city laws.”
The fight to control social media
The council is certainly not the first to argue that social media companies’ negligence has caused depression and anxiety in users, particularly youth, and to take action against them on those grounds. Just last month, social media companies Meta and Google were found liable for causing depression and anxiety in users in a landmark California case, and Meta was found liable for failing to protect kids from exploitation in New Mexico.
There are currently about 4,000 pending cases alleging social media companies are liable for mental or physical harm to users, NPR reports. Many are making arguments similar to those made about the tobacco industry in the 1990s: that social media companies knew about the risks to children and users generally, but negligently did not warn or provide sufficient protection measures for users.
Notably, the New York Court of Appeals is currently considering a wrongful death case where a mother is arguing Meta and ByteDance, which owns TikTok, are liable for her son’s death because videos he saw on social media platforms pushed him to take part in subway surfing, which ended his life.
New York City and State have taken a litany of enforcement and regulatory actions against social media companies in recent years. The state passed two laws in 2024, the SAFE for Kids and Child Data Protection Acts, which prevent social media companies from providing algorithm-based content feeds to kids and collecting their data, and the city filed a suit against Meta in California federal court last year.
However, Harrison said laws limiting kids’ social media usage wouldn’t necessarily prevent people from suing over harms caused by social media platforms, or shield tech giants from litigation.
Stevens said Tuesday she was open to feedback on the bills and would look to continue amending them to accomplish her goal of protecting children’s mental health from social media companies.
