The California trial (KGM v Meta) was the first in a consolidated action to hold social media companies responsible for the harm caused by their platform design choices. That litigation comprises more than 1,600 plaintiffs, including families and school districts from across the nation.
Case attorney Jayne Conroy told the BBC, “It was a clean sweep with respect to liability against both Google and Meta. It will matter.”
She added, “I bet there’s a lot of math going on in boardrooms at Meta, Google, Snap and TikTok as they evaluate what that means if they know thousands of cases are coming their way.”
Attorneys General in more than 40 US states have filed cases against Meta, claiming that young people are suffering a mental health crisis as a result of social media addiction that the platforms’ parent companies either knew of or deliberately fostered.
Australia, like the rest of the world, has watched these cases keenly, especially having pioneered the social media ban for teenagers.
Shine Lawyers, which has offices across Australia and New Zealand, is among the firms reported to be assessing the viability of a claim. Others include Maurice Blackburn and Slater & Gordon. Shine Lawyers holds the record for the largest product liability class action settlement in Australian legal history: the $300 million pelvic mesh settlement against Johnson & Johnson and its subsidiary Ethicon, approved in 2023.
Lisa Flynn, Chief Legal Officer at Shine Lawyers, says, “Irrespective of any appeal, the reputational damage to Meta from the findings in this case is substantial. A jury has accepted evidence that core product features were intentionally designed to be addictive and that those design choices caused foreseeable harm. Such findings are significant and impact the way people perceive these platforms.”
To that end, she says tech companies have been put on notice. “Once a jury makes those findings, they become part of the public and legal record, and that has long‑term consequences for how these platforms are viewed by courts, regulators and the community.”
Even if Meta and Google succeed in their appeals, Flynn says the damage is done. “Importantly, reputational harm also affects how future courts approach these cases. The narrative has changed from ‘complex social issue’ to questions about product design, duty of care and corporate knowledge.”
So, emboldened by the US cases, will Australia see class actions? “Australian families are watching this closely because the same products, the same design features and the same risks exist here,” says Flynn. “While our legal framework differs from the United States, Australian courts are increasingly willing to scrutinise the conduct of global corporations where harm occurs locally.”
The US verdict doesn’t automatically translate into liability here, she explains, “but it does demonstrate that complex technology cases can be understood by juries and courts when the evidence is properly tested”.
She tells LSJ Online that there is a real prospect of Australian claims, particularly where harm to children and adolescents can be supported by medical, psychological and usage evidence. “There is community appetite to ask whether these platforms crossed a line from being engaging to being harmful, particularly for children. We are already seeing families asking whether the law offers them any avenue for accountability.”
There is no direct Australian precedent yet on social media addiction of this kind, Flynn acknowledges. “[B]ut that’s how emerging areas of law develop. We have seen similar evolution in tobacco, asbestos and financial services litigation.
“Courts don’t require a claim to have been run before, but they do require credible evidence, strong legal theory, and clear harm to have occurred because of the wrongdoing. The US cases are helping to establish a framework which might be applied here.”
Flynn confirms that Shine Lawyers is currently working through enquiries and investigating how an Australian claim could be run, “including whether the evidence supports a sustainable cause of action”.
In the California case, Kaley’s attorney Mark Lanier won over the US media with some charismatic flourishes during the trial. At one point, to illustrate the immensity of Meta’s fortunes, he presented the jury with a jar of M&Ms, each representing $US1 billion. It would take 1,400 M&Ms to represent the roughly $US1.4 trillion Meta is valued at.
Failing to fix the problem
In a statement, Lanier said: “The evidence showed that Meta and YouTube knew their platforms were hooking children and harming their mental health, and instead of fixing the problem they kept developing features to maximize the time kids spent on their apps. Now a jury has told them that is not acceptable, and you are being held accountable.”
In the first verdict to be announced, a jury in New Mexico found that Meta had misled users of Facebook and Instagram to believe that these apps were safe, despite the platforms enabling child sexual exploitation and human trafficking. In the case, brought by the New Mexico Attorney General, Meta was ordered to pay $US375 million ($538 million). State prosecutors had asked for more than $2 billion.
The plaintiff in the California case, Kaley, claimed that she was compelled to use Instagram and YouTube all day as a result of algorithms designed to hook users and keep them on the apps. Importantly, the content itself was not in question. Technology companies are legally shielded from liability for posted content under Section 230 of the Communications Decency Act. Plaintiff attorneys instead successfully advanced a theory that treated social media platforms as defective products under California product liability law, centred on the design features the companies engineered to maximise engagement.
Following more than 40 hours of deliberations, the jury awarded the plaintiff $US3 million ($4.3 million) in compensatory damages and recommended a further $US3 million in punitive damages, having found that the companies acted with malice, oppression or fraud in harming children.
In the California case, the jurors found that both companies knew or should have known their services posed a danger to minors, that they failed to adequately warn users of that danger, and that a reasonable platform operator would have done so.
According to Lanier’s press statement, the verdict against Meta and Google is expected to significantly influence settlement negotiations and trial outcomes in the remaining cases, as well as in the federal multidistrict litigation of more than 2,300 pending cases in the Northern District of California, the first of which is scheduled to begin in June 2026.
“Accountability has arrived,” lawyers for the plaintiff said in a statement. A spokesperson for Meta said they “respectfully disagree” with the verdict and would weigh their options.
The panel assigned Meta 70 per cent of the responsibility for the plaintiff’s harm – a $US2.1 million ($3.02 million) share of the compensatory award – and YouTube the remaining 30 per cent, or $US900,000 ($1.29 million).
Meta has said it will appeal both the California and New Mexico verdicts, while Google has said it will appeal only the California verdict.
Government reaction here
On 26 March, Communications Minister Anika Wells shared a Facebook post on the verdict with the caption: “The drum beat against social media harm is getting louder.”
On 28 March, Wells shared a post to the same platform stating: “I have significant concerns that social media platforms are failing to obey Australia’s world-leading law. [The] eSafety Commissioner and I will have more to share next week.”
The federal government has not yet implemented the Digital Duty of Care legislation it promised, which would require tech companies such as Meta and Google to ensure their platforms do not cause harm to Australian users. The consultation on the development of a Digital Duty of Care under the Online Safety Act 2021 (Cth) closed in December 2025.
On 25 March, the Australian Government extended the definition of social media platforms that must comply with Australia’s under-16s social media ban to include those that have systems “designed to be addictive and provide constant dopamine hits”, and those “designed to create urgency so young people check apps constantly”.
Wells issued a statement that said: “Targeted algorithms, doomscrolling, persistent notifications and toxic popularity metrics are stealing their attention for hours every day.”
On 30 March, the eSafety Commissioner, the Human Rights Commissioner, and MPs met with child safety advocates and online safety experts at Parliament House to discuss concerns and responses to the alleged rise in livestreamed child sexual abuse in Australia and the broader Asia Pacific region.
In February, an investigation by Queensland Police Crime Command resulted in a 27-year-old man being charged with 596 child abuse-related offences. As part of Operation Xray Wick, which began in February last year, detectives located videos and images relating to hundreds of victims. The man was arrested at that time on child abuse-related offences and has been in custody since February 2025.
Specialist investigators conducted digital forensic examinations, locating over 23,000 videos and images of his offending against 459 victims across multiple jurisdictions in Australia and overseas. Police will allege the man self-produced the child abuse material, and that he actively targeted children on social media and gaming platforms between 2018 and 2025. The victims were primarily aged between 7 and 15 years.
A report by IJM and global child safety institute Childlight found that one in 15 Australian men surveyed had engaged in sexual webcamming with a child or expressed a desire to do so.
In relation to the California and New Mexico cases, Shine Lawyers’ Flynn says, “Ultimately, you shouldn’t be allowed to create a harmful product and get away with it. Like many Australians, we’re watching this case closely.”
She adds: “What this US case has done is give people confidence that these claims are grounded in fact. Product design and intention is being factored into legal argument, and this is a powerful development for social media users.”
