Stock Markets February 18, 2026

Landmark Trial Opens Over Alleged Design of Social Apps to Hook Children

California jury hears opening arguments in lawsuit accusing Meta and Google’s YouTube of building products intended to addict young users

By Marcus Reed | META, GOOGL, SNAP

A California trial opened as a 20-year-old plaintiff alleges Meta Platforms and Google’s YouTube intentionally engineered social apps to addict children, fueling her depression and suicidal thoughts. Lawyers for the plaintiff say internal documents reveal purposeful design choices to capture young users; the tech companies deny the claims and say other life factors explain her harms. The case could clarify whether platforms can be held liable for their design and influence thousands of pending lawsuits.

Key Points

  • A 20-year-old plaintiff alleges Meta and Google’s YouTube intentionally designed apps to addict children; the trial will determine whether platform design can create legal liability.
  • Defense arguments will emphasize the plaintiff’s personal history and highlight companies’ youth-safety efforts; jurors were instructed that liability applies to platform design and operation, not to content posted by third parties.
  • The case could influence thousands of similar state and federal lawsuits and has potential implications for the technology and legal sectors, particularly social-media companies and their risk exposures.

LOS ANGELES - A high-stakes trial began in California on Monday, centering on whether major social media platforms deliberately crafted their products to addict children and can therefore be held responsible for users' mental-health harms.

The plaintiff, a 20-year-old woman identified in court as Kaley G.M., has brought suit against Meta Platforms - the parent of Facebook and Instagram - and Alphabet’s Google, owner of YouTube. Her attorney, Mark Lanier, told jurors that Kaley became hooked on social media at a young age because of the way the apps were designed, citing internal company documents to support the allegation.

"These companies built machines designed to addict the brains of children, and they did it on purpose," Lanier said in his opening statement.

In response, Meta’s lawyer Paul Schmidt sought to frame Kaley’s struggles in the context of preexisting personal circumstances. Schmidt referenced health records that, he said, document a history of verbal and physical abuse and a difficult parental relationship following her parents’ divorce when she was three. Schmidt challenged jurors to consider whether removing Instagram alone would have fundamentally altered Kaley’s life.

YouTube’s attorney is scheduled to deliver an opening statement on Tuesday. Both Meta and Google have denied the allegations in the complaint.


The trial is being watched as a possible test case for whether the design of social media platforms can create legal liability. A verdict against Meta and Google could lower barriers for similar suits in state courts and challenge an established legal defense that platforms have used in U.S. litigation. The outcome may affect an array of pending litigation targeting major social networks.

Kaley’s legal team says the apps worsened her mental-health condition, including depression and suicidal thoughts, and that the companies were negligent in how they designed their products, failed to warn the public of risks, and played a substantial role in her injuries. If the jury accepts those claims, it will decide whether to award damages for pain and suffering and may consider punitive damages.

Meta and Google intend to counter those claims by pointing to other contributing factors in Kaley’s life, detailing their youth-safety work, and arguing that they should not be held responsible for harmful content uploaded by third parties.

Los Angeles Superior Court Judge Carolyn Kuhl instructed jurors that the companies cannot be held liable for recommending content that others created; liability, she said, relates only to the companies’ own design and operation of their platforms. Under prevailing U.S. law, internet companies are broadly shielded from responsibility for material posted by their users, and a verdict against the companies in this case could open the door to more litigation alleging harmful design.


The case is part of a wider wave of legal action. In federal court, the companies face more than 2,300 similar lawsuits brought by parents, school districts, and state attorneys general. A judge overseeing those consolidated federal claims is weighing the companies’ liability protections ahead of a scheduled first federal trial that could occur as early as June.

Separately, on Monday a jury in Santa Fe, New Mexico, heard opening statements in that state’s legal action accusing Meta of profiting from its platforms while exposing children and teenagers to sexual exploitation and mental-health harms. Donald Migliori, representing the New Mexico attorney general, told the jury that Meta had made money "while publicly misrepresenting that its platforms were safe for youth, downplaying or outright lying about what it knows about the dangers of its platforms."

Meta’s attorney in New Mexico, Kevin Huff, told that jury the company has made significant efforts to protect users and has cautioned that harmful content can appear on platforms despite safeguards.


In the California trial, Meta Platforms CEO Mark Zuckerberg is expected to be called as a witness, and the case is likely to continue into March. TikTok and Snap reached settlements with Kaley before the trial began. Kaley herself is expected to testify.

The legal theory presented by Kaley’s attorneys centers on negligence and failure to warn, asserting that the platforms’ design choices were a substantial factor in producing harm. The defense strategy presented by Meta and Google emphasizes alternative causes, corporate safety measures, and the distinction between platform design and third-party content.

Observers have noted that if jurors reject the companies’ immunity arguments, similar suits alleging that platforms are harmful by design could gain traction.


The litigation in the United States reflects a broader international backlash focused on youth mental health. Australia has enacted a prohibition on social media access for users under age 16, and other countries, including Spain, have proposed - but not yet enacted - similar restrictions for underage users.

The trial will test the tensions among product design, corporate responsibility, legal protections for online services, and the personal hardships described by the plaintiff. Jurors will now weigh evidence and competing explanations as the case progresses.

Risks

  • If jurors find the companies liable for platform design, it could increase legal exposure and potential damages for social-media firms, affecting the technology sector and corporate legal costs.
  • A verdict against the companies would likely encourage additional suits claiming design-driven harms, increasing uncertainty for companies, insurers, and investors tied to social platforms.
  • Regulatory and policy responses may accelerate internationally, with governments considering curbs on youth access; such measures could alter user engagement patterns and revenue models in the social-media industry.
