
Meta and YouTube found liable in landmark social media addiction trial

Pixelift Editorial Team

Foto: BBC Tech

Six million dollars in damages: that is what Meta and Google must pay a 20-year-old plaintiff in a landmark trial over social media addiction. A jury in Los Angeles ruled that the tech giants intentionally designed their platforms as "addiction machines," driving the young woman into depression, anxiety, and body dysmorphic disorder. Meta was held responsible for 70% of the awarded amount and Google for the remaining 30%, with the jurors finding that the companies acted in bad faith and committed fraud in the way they managed their algorithms.

The verdict marks a turning point in the global debate over Big Tech's responsibility for the mental health of its youngest users. Although Meta and Google have announced appeals, arguing that teen mental health is too complex an issue to attribute to a single app, the ruling opens the door for hundreds of similar lawsuits awaiting trial. For users worldwide, it offers a real chance to force changes in the architecture of services such as Instagram and YouTube. Features like infinite scroll and beauty filters, previously treated as standard engagement tools, were directly linked in this trial to destructive effects on users' mental health. The plaintiff's success could become a catalyst for new regulation, forcing platforms to abandon their most invasive mechanisms for capturing the attention of children and adolescents.

Parents and loved ones of victims gathered outside the court in Los Angeles to hear the historic verdict.

The mechanism of the "addiction machine" under court scrutiny

The arguments from Kaley's lawyers struck at the very foundations of the social media platforms' business model. During the five-week trial, it was demonstrated that features such as **infinite scroll** are not merely interface conveniences, but precisely engineered mechanisms aimed at maximizing time spent in the application.

Kaley testified that she started using **YouTube** at the age of six and **Instagram** at nine, despite official rules prohibiting access to anyone under 13. Age verification systems proved to be a fiction in this case. Kaley described a process of gradual isolation from her family and a deepening obsession with her own appearance. The use of filters that alter facial features, narrowing the nose and enlarging the eyes, led to her diagnosis of body dysmorphia. That condition, combined with depression and anxiety, became the foundation of a claim the jurors found fully justified.

Meta, which is to cover 70% of the awarded amount, defended itself by claiming that the mental health of teenagers is too complex a matter to be linked to a single application. Google representatives, meanwhile, attempted to distance **YouTube** from the definition of social media, calling it a "responsibly built streaming platform." These arguments did not convince the jury, which saw bad faith in the corporations' actions.

Confrontation with facts and internal data

The key moment of the trial was the questioning of **Mark Zuckerberg**. The Meta chief argued that the company cares about the safety of its youngest users, but he was confronted with internal documents showing that the Menlo Park giant was well aware that millions of children below the age limit were using the platform, and that the pace of work on identifying them was, as Zuckerberg himself admitted, far from ideal.

Even more controversial was the testimony of **Adam Mosseri**, the head of Instagram. Presented with evidence that Kaley spent up to 16 hours a day in the app, Mosseri refused to call it an addiction, describing such extreme usage merely as "problematic." This stance by management, combined with testimony from former Meta employees about prioritizing the growth of young users over their safety, tipped the scales in favor of the plaintiff.
The verdict in Los Angeles is a signal that the era of Big Tech's lack of responsibility for users' mental health is coming to an end.

Global shock and consequences for the industry

The verdict in Los Angeles is not an isolated case, but part of a broader trend. Just a day earlier in New Mexico, Meta was found guilty of exposing children to sexual predators and explicit content. Mike Proulx from the research firm Forrester notes that we are dealing with a tipping point. Negative public sentiment toward social media, built up over years, has finally found its outlet in the legal system. It is worth noting the financial dimension of the verdict:
  • $3 million in compensatory damages.
  • $3 million in punitive damages.
  • Payment split: 70% Meta, 30% Google.
  • Earlier settlements with TikTok and Snap (amounts undisclosed).
Governments around the world are already reacting to these reports. Australia is introducing drastic age restrictions, and the UK is testing pilot programs banning social media use for those under 16. For technology creators and UX designers, the message is clear: user engagement features cannot be designed in isolation from their psychological effects.

A new standard of responsibility for digital products

Kaley's victory is a signal that recommendation algorithms and user retention mechanisms are no longer treated as a "black box" for which companies bear no responsibility. If courts begin to routinely classify platforms as "addiction machines," we face a fundamental restructuring of how content is consumed online. Meta and Google have announced appeals; regardless of the outcome, however, a precedent has been set. The tech industry must prepare for the fact that engagement metrics, once a source of pride for investors, may become evidence for the prosecution in the courtroom. Another major case against tech giants begins as early as June in a federal court in California. Everything indicates that the era in which a 16-hour Instagram session was called merely "problematic" is passing irrevocably, and accountability for the impact of AI and algorithms on mental health will become a standard element of compliance for every global technology company.
Source: BBC Tech