Heading into 2024, there is growing momentum behind litigation in the US targeting major social media firms. The platforms are facing scrutiny for their design and business practices, which some argue are causing harm to the mental and physical well-being of young users. This review focuses on these legal actions and places them in the context of ongoing policy discussions and legislation intended to protect children’s online safety. It explores:
- Various legal strategies to hold social media companies accountable for youth harms online, including suits brought by state attorneys general against Meta, and suits brought by families and school districts against Google, Meta, Snap, and TikTok;
- The Federal Trade Commission’s efforts to enforce COPPA against major social media companies;
- The evolution of state and federal child online safety legislation.
While the litigation process is often slow and winding, these developments could pose significant challenges for social media companies in 2024.
1. State attorneys general sue Meta over youth harms
On Oct. 24, 2023, a bipartisan coalition of 42 state attorneys general filed ten distinct lawsuits against Meta, alleging the company knowingly designed and deployed harmful features across its social media platforms that purposefully addict children and teens, and according to some of the filings, did so in violation of the Children’s Online Privacy Protection Act (COPPA).
The suit most widely reported on was filed in the US District Court for the Northern District of California and includes a coalition of 33 states. These include Arizona, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Illinois, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Virginia, Washington, West Virginia, and Wisconsin. An additional nine suits were filed by attorneys general in their respective state courts: Florida, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah, and Vermont, as well as the District of Columbia.
Many attorneys general in the suits claim Meta violated their respective states’ unfair or deceptive trade practices or consumer protection laws.
The 33 state attorneys general mainly allege, however, that Meta had “actual knowledge” it was collecting data from millions of users under 13, in direct violation of COPPA. The initial filing, which was heavily redacted and criticized by some as a legal long shot, was unsealed on Nov. 25, 2023, revealing a number of ways Meta allowed underage users to remain active on Facebook and Instagram, explicitly targeted its products toward children, and buried internal research establishing how certain features harm young users.
Angela Campbell, a Professor Emerita at Georgetown Law who was involved in the effort to pass COPPA, says that the suits’ strongest claim is that Meta had “actual knowledge” children were on its platform. “They got complaints from parents who would say, ‘My 12 year old is on Facebook, please take them down.’ And their policy was not to take them down. I mean, that’s an absolute dead to right, clear violation of COPPA,” Campbell told Tech Policy Press.
These COPPA violation claims are absent from some of the individual state attorney general complaints against Meta. The filing from Massachusetts, for instance, opted instead for a “public nuisance” legal strategy that hundreds of school districts across the US have been mounting against the social media companies. “Meta’s conduct has placed an undue burden on the Commonwealth of Massachusetts, including burdens to school systems and increased health care expenditures, to address the mental and physical health harms that Meta has contributed to in Massachusetts youth,” said Massachusetts Attorney General Andrea Joy Campbell in a press release accompanying her office’s filing against Meta.
This cluster of lawsuits against Meta emerged from a nationwide investigation that the Massachusetts Attorney General’s Office co-led in 2021, after Facebook whistleblower Frances Haugen leaked documents showing that Meta knew how toxic Instagram was for teen girls, yet downplayed the issue publicly. Last month, on Nov. 2, 2023, former Meta security expert Arturo Béjar blew the whistle on how Meta was not only failing to protect teens, but also facilitating predatory behavior on Instagram. (Béjar testified before the Senate less than a week after the public revelations. A full transcript can be found on Tech Policy Press.) And late last month, the Wall Street Journal reported that Instagram’s algorithm was serving up “risqué footage of kids” to adults who follow children. In response, Meta publicly stated it had created a task force to review existing policies and enforcement systems in its fight against online predators.
On Dec. 5, New Mexico Attorney General Raúl Torrez filed a 225-page lawsuit against Meta and its CEO Mark Zuckerberg for allegedly failing to protect children from sexual abuse, online solicitation, and human trafficking. The suit also argues that the flawed design of Instagram and Facebook led to the recommendation of child sexual abuse material (CSAM) and child predator accounts to minor users. “To avoid any doubt, the State does not assert a claim pursuant to its authority to enforce the Children’s Online Privacy Protection Act (“COPPA”), but asserts instead that Meta’s practices in violation of COPPA constitute unfair practices under New Mexico law,” the complaint reads. It accuses Meta of both violating New Mexico’s Unfair Practices Act (UPA) and creating a “public nuisance.”
2. Nationwide litigation tests social media’s liability for youth mental health crisis
The lawsuits brought by state attorneys general are not the only coordinated legal challenges being mounted against major social media companies over the design of their products and the potential impacts on the youth mental health crisis. Thousands of individual plaintiffs have filed suit in the past two years. These suits broadly fall into two categories – families and school districts – with distinct legal strategies pursued by each.
Families file personal injury claims against major social media companies
Thousands of families of teenagers and children have filed private personal injury suits over the “unreasonably dangerous” design of social media platforms. They allege that social media companies – particularly Meta (Facebook and Instagram), Google (YouTube), Snapchat, and TikTok – use sophisticated algorithms to intentionally target and addict young users, which has led to dangerous, and at times fatal, physical and mental health outcomes. These suits are an attempt to affirmatively establish social media companies’ strict liability over the design of their products and negligence for the harm experienced. They also represent a novel approach to sidestepping social media companies’ Section 230 and First Amendment immunity claims.
School districts sue social media companies
Seattle School District No. 1 filed a first-of-its-kind lawsuit against Meta, Snap, TikTok, and Google on Jan. 6, 2023. The suit, brought in the US District Court for the Western District of Washington, alleged that the social media companies designed their respective platforms to “exploit the psychology and neurophysiology of their users” for profit. This, in turn, created a youth mental health crisis, which has stretched the Seattle Public Schools’ mental health resources, impacting its “ability to fulfill its educational mission,” and creating a “public nuisance.”
Other school districts around the country began suing in waves, and a national coalition of firms formed to represent them and other public entities in pursuing public nuisance claims against social media companies. Just days after Seattle’s lawsuit, Washington’s Kent School District filed a similar complaint. Mesa Public Schools, one of Arizona’s largest districts with more than 82 schools, sued the social media companies later that January. California’s San Mateo County School District, located in Meta’s home county, filed suit in March. By September, more than 600 school districts had taken some legal action against the social media companies.
3. Consolidating thousands of social media and youth harms lawsuits
On Oct. 6, 2022, the US Judicial Panel on Multidistrict Litigation ordered the consolidation of suits filed in federal courts – both from individual plaintiffs and the school districts – against social media companies Meta (Instagram and Facebook), Snapchat, TikTok, and Google (YouTube) to the Northern District of California before Judge Yvonne Gonzalez Rogers. “Several defendants are headquartered in or near this district, and centralization will facilitate coordination with the state court cases pending in California,” the order read, referring to the mostly California-based social media giants. The initial transfer included 28 actions pending across 17 US districts, but the multidistrict litigation (MDL) has since grown to include at least 429 lawsuits against the social media platforms.
Lawsuits filed in California state courts were also consolidated into a Judicial Council Coordination Proceeding (JCCP) – the California state version of multidistrict litigation – and assigned to Judge Carolyn B. Kuhl in the Superior Court of Los Angeles County. The proceedings involve around 800 cases brought by California residents and school districts against the four social media companies, according to Law.com.
Both courts have since allowed their respective complex litigation cases to proceed, and the social media companies must now face the youth addiction and mental health lawsuits. On Oct. 13, 2023, Judge Kuhl issued an 89-page order dismissing, on Section 230 and First Amendment grounds, plaintiffs’ claims that social media companies are liable for the defective design of their products, but allowed the case to proceed on its negligence per se claims. “Plaintiffs also have adequately pled a claim for fraudulent concealment against Defendant Meta,” Judge Kuhl added, referencing claims that Meta failed to warn of the potentially harmful effects of its design features. These “tangible” defective design features, according to Judge Kuhl, include inadequate age verification and a lack of parental controls and notifications.
On Nov. 14, Judge Gonzalez Rogers – who does not have to take the JCCP ruling into consideration – issued an order on similar grounds, meaning the plaintiffs’ product liability claims are dismissed, but the negligence per se claims can proceed. Both the JCCP and MDL will now enter discovery, and platforms will have to comply with requests for information and participate in depositions under oath.
The recent groundswell of judges allowing claims against major social media companies to proceed marks the first time in history that the companies will be held accountable for the design of their products and the resulting mental health harms, according to Matthew Bergman, founder of the Social Media Victims Law Center. Bergman, a product liability litigation attorney who spent his career fighting asbestos companies, and whose firm filed the first product liability claims against social media platforms based on youth addiction, told Tech Policy Press that Judge Kuhl has provided counsel access to the same sealed material that the 42 state attorneys general used in their explosive filings against Meta.
4. The Federal Trade Commission’s COPPA enforcement
The Children’s Online Privacy Protection Act (COPPA), enacted in 1998, gives parents more control over the information collected online from children under 13. Online service providers with “actual knowledge” that children under 13 are using their websites must follow a series of rules, like obtaining parental consent before collecting children’s personal information or giving parents the right to have their children’s information deleted. Companies can face steep fines for not complying with the law. COPPA is primarily enforced by the Federal Trade Commission (FTC), but states and certain federal agencies also have the authority to enforce compliance within their jurisdictions.
In the 23 years since COPPA went into effect, only 39 enforcement actions have been brought by the FTC. All of these were settled without litigation and most cases involved smaller companies. “Often, settlements merely required the defendant to comply with the law and file periodic reports with the FTC. When the FTC has assessed civil penalties, they have been woefully insufficient to incentivize compliance with COPPA,” said Angela Campbell in her written testimony to the Senate for a subcommittee hearing titled “Protecting Kids Online: Snapchat, TikTok, and YouTube” on March 18, 2021.
The largest civil penalty to date involved an enforcement case from the FTC and New York Attorney General against Google in 2019. As part of a settlement over allegations that its subsidiary, YouTube, illegally collected kids’ personal information without parental consent in violation of COPPA, the company agreed to pay a record $170 million and make changes to its services.
The FTC has zeroed in on large social media companies in recent years. In May, the agency proposed a blanket ban prohibiting Meta from monetizing youth data after alleging the company violated a 2020 privacy order, related to the Cambridge Analytica scandal, that came with a record-breaking $5 billion civil penalty. The complaint alleges that Meta not only failed to conduct privacy reviews for new products and document its risk mitigation strategies, but also continued to mislead parents about their ability to control who their children under 13 communicated with through its “Messenger Kids” app, in violation of COPPA.
Whether the ban will go into effect will depend on determinations over the FTC’s authority. Meta responded to the proposal by calling the move a “political stunt” that usurps the authority of Congress. However, a coalition including the Electronic Privacy Information Center (EPIC) and the Center for Digital Democracy later sent a letter to FTC Chair Lina Khan outlining the ways the Commission is able to modify its 2020 privacy order with Meta. “The FTC’s impetus to secure limitations on minors’ data reflects minor’s unique vulnerability to Meta’s repeated violations of the law, and is well-founded under the Commission’s authority,” the statement read.
Meta and Google have also been accused of COPPA violations by outside groups. In 2019, a coalition including Common Sense Media and the Electronic Privacy Information Center filed a complaint with the FTC using information revealed by a class action lawsuit. The suit, which was settled in 2016, alleged that Meta created a system that “encouraged children to make unknowing and unauthorized credit card purchases” for games and “set up a labyrinthine complaint system to deter refund requests.” And as recently as August, a coalition of parental rights groups urged the FTC to look into YouTube for allegedly still serving up personalized ads on its “made for kids” videos. The FTC has not officially investigated either matter.
5. Federal legislation
Legislative action by the federal government is also possible, although unlikely in the near term. The Kids Online Safety Act (KOSA) and an updated version of COPPA (COPPA 2.0) are the two bills with the best chance of passing. KOSA is sponsored by Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), and was recently reported out of committee with overwhelming bipartisan support. COPPA 2.0 is sponsored by Sens. Ed Markey (D-MA) and Bill Cassidy (R-LA) and aims to update online data privacy rules for today’s internet. More specifically, it would build on COPPA by raising the age limit of children covered by the law from 13 to 16 years old, creating an “eraser button” for parents and kids to delete a child’s personal information, and establishing a Digital Marketing Bill of Rights for Teens.
While there was renewed hope this fall that KOSA might come up for a floor vote, the Senate is no closer to moving the bill along as the year draws to a close. And since 2024 is an election year, few observers anticipate Congress will get much done.
6. State legislation
As Congress struggles to pass legislation, states are trying their hand at protecting youth online. The California Age-Appropriate Design Code Act (CAADCA, or AB-2273), which has been billed by its supporters as a first-in-the-nation law to protect children online, passed unanimously in the California legislature and became law on Sept. 15, 2022. The Act requires online businesses likely to be accessed by children – defined as any user under the age of 18 as determined through “age assurance” methods – to default privacy settings to the highest level and complete an impact assessment before any new products or features are made publicly available.
However, about a year after the CAADCA became law, Judge Beth L. Freeman of the US District Court for the Northern District of California issued a preliminary injunction on Sept. 18, 2023. In the 45-page decision, she concluded that the plaintiffs demonstrated a likelihood of success in proving the Act is facially unconstitutional since it would violate the First Amendment, and such “speech restrictions” would fail strict or even lesser scrutiny.
Other states face an uphill legislative battle on similar grounds as they consider different child online safety laws. At least 144 different bills spanning 43 state legislatures were introduced just in the first half of 2023. “These laws vary significantly, with some pronounced differences between Democratic and Republican legislators,” Tim Bernard wrote for Tech Policy Press. “In very broad strokes, Democrats have proposed laws similar to the AADC that require platforms to mitigate harms to minors, while Republicans appear to be focused on age verification measures or filtering laws, often to restrict access to pornography,” Bernard explained.
Some of these laws have also been blocked by the courts. On Aug. 31, Arkansas’ Social Media Safety Act (SB 396), which requires a minor to seek a parent or guardian’s consent to open a social media account and mandates age verification, was blocked just hours before it was set to take effect. US District Judge Timothy L. Brooks declared the Act unconstitutional, finding that requiring users to upload driver’s licenses to the internet would deter adults from using the platforms, thus chilling speech. That same day, Judge David A. Ezra enjoined Texas’ HB 1181, which would restrict minors’ access to adult content online.
“We’ve never had companies with so much influence, so much power, and so much money – not just even in the US but globally – and it’s just causing all kinds of harm. And our laws just really aren’t up to the task of addressing those harms,” Angela Campbell told Tech Policy Press. While she thinks much of the current litigation and legislation is a long shot, she believes that, cumulatively, these efforts will have an effect.
Other advocates agree that we may be at a tipping point for holding social media companies accountable on child online safety. “The court of law, the halls of Congress, and the court of public opinion are becoming more sensitized, more aware of the carnage that social media platforms are inflicting on young people, and more convinced that absent government regulation and civil justice measures, these companies are not going to change their behavior,” said Matthew Bergman of the Social Media Victims Law Center. “I think that we have crossed a Rubicon here.”