LOS ANGELES - A high-stakes trial began in California on Monday, centering on whether major social media platforms deliberately crafted their products to addict children and can therefore be held responsible for users' mental-health harms.
The plaintiff, a 20-year-old woman identified in court as Kaley G.M., has brought suit against Meta Platforms - the parent of Facebook and Instagram - and Alphabet’s Google, owner of YouTube. Her attorney, Mark Lanier, told jurors that Kaley became hooked on social media at a young age because of the way the apps were designed, an allegation he said was drawn from the companies' internal materials.
"These companies built machines designed to addict the brains of children, and they did it on purpose," Lanier said in his opening statement.
In response, Meta’s lawyer Paul Schmidt sought to frame Kaley’s struggles in the context of preexisting personal circumstances. Schmidt referenced health records that, he said, document a history of verbal and physical abuse and a difficult parental relationship following her parents’ divorce when she was three. Schmidt challenged jurors to consider whether removing Instagram alone would have fundamentally altered Kaley’s life.
YouTube’s attorney is scheduled to deliver an opening statement on Tuesday. Both Meta and Google have denied the allegations in the complaint.
The trial is being watched as a possible test case for whether the design of social media platforms can create legal liability. A verdict against Meta and Google could lower barriers for similar suits in state courts and challenge an established legal defense that platforms have used in U.S. litigation. The outcome may affect an array of pending litigation targeting major social networks.
Kaley’s legal team says the apps worsened her mental-health problems, including depression and suicidal thoughts, and that the companies were negligent in how they designed their products, failed to warn the public of the risks, and played a substantial role in her injuries. If the jury accepts those claims, it will decide whether to award damages for pain and suffering and may consider punitive damages.
Meta and Google intend to counter those claims by pointing to other contributing factors in Kaley’s life, detailing their youth-safety work, and arguing that they should not be held responsible for harmful content uploaded by third parties.
Los Angeles Superior Court Judge Carolyn Kuhl instructed jurors that the companies cannot be held liable for recommending content that others created; liability, she said, relates only to the companies’ own design and operation of their platforms. Under Section 230 of the Communications Decency Act, internet companies are broadly shielded from responsibility for material posted by their users, and a verdict holding the companies liable for their own design choices could open the door to more litigation alleging harmful design.
The case is part of a wider wave of legal action. In federal court, the companies face more than 2,300 similar lawsuits brought by parents, school districts, and state attorneys general. A judge overseeing those consolidated federal claims is weighing the companies’ liability protections ahead of a scheduled first federal trial that could occur as early as June.
Separately, on Monday a jury in Santa Fe, New Mexico, heard opening statements in that state’s legal action accusing Meta of profiting from its platforms while exposing children and teenagers to sexual exploitation and mental-health harms. Donald Migliori, representing the New Mexico attorney general, told the jury that companies are free to pursue profit in the United States, but that Meta had made money "while publicly misrepresenting that its platforms were safe for youth, downplaying or outright lying about what it knows about the dangers of its platforms."
Meta’s attorney in New Mexico, Kevin Huff, told that jury the company has made significant efforts to protect users and has cautioned that harmful content can appear on platforms despite safeguards.
In the California trial, Meta Platforms CEO Mark Zuckerberg is expected to be called as a witness, and the case is likely to continue into March. TikTok and Snap reached settlements with Kaley before the trial began. Kaley herself is expected to testify.
The legal theory advanced by Kaley’s attorneys centers on negligence and failure to warn, asserting that the platforms’ design choices were a substantial factor in producing her harm. Meta and Google's defense emphasizes alternative causes, their safety measures, and the distinction between platform design and third-party content.
Observers have noted that if the design-based claims succeed despite the companies’ immunity arguments, similar suits alleging that platforms are harmful by design could gain traction.
The litigation in the United States reflects a broader international backlash focused on youth mental health. Australia has enacted a prohibition on social media access for users under age 16, and other countries, including Spain, have proposed, but not yet enacted, similar restrictions for underage users.
The trial will test how product design, corporate responsibility, and legal protections for online services intersect with the personal hardships described by the plaintiff. Jurors will now weigh the evidence and competing explanations as the case progresses.