A landmark legal decision in Los Angeles marks a significant moment in the ongoing debate over social media's impact on young people's mental health. A jury awarded $6 million in damages to a 20-year-old woman, identified only as Kaley, who sued social media giants Meta and Google, claiming that their platforms caused her childhood addiction and subsequent mental health problems. The verdict, which holds Meta and Google accountable for intentionally designing addictive social media experiences that harmed Kaley, has been hailed as a potential catalyst for similar lawsuits across the United States.
Kaley's lawsuit accused Meta, the parent company of Instagram, Facebook, and WhatsApp, and Google, owner of YouTube, of deliberately building platforms that ensnared young users and contributed to mental health disorders. The jury agreed, awarding $3 million in compensatory damages and an additional $3 million in punitive damages. The punitive damages were imposed on the basis that the companies acted with "malice, oppression, or fraud" in the way they operated their platforms. Meta is responsible for 70% of the damages, with Google covering the remaining 30%.
The verdict has been welcomed by parents and advocacy groups campaigning for stricter regulation of social media. Outside the courthouse, families of children who were not involved in the lawsuit but who say they were similarly harmed gathered to celebrate the outcome. Amy Neville, one such parent, was seen embracing other supporters, a scene that reflected the deep emotional weight the verdict carries for those affected by social media's harms.
This decision comes amid increasing scrutiny of social media companies' responsibilities toward young users. It follows closely on the heels of another jury ruling in New Mexico that found Meta liable for exposing children to sexually explicit content and contact with predators. These back-to-back verdicts suggest a growing judicial willingness to hold social media companies accountable for the negative effects their platforms may have on children and teens.
Experts see these outcomes as signals of a "breaking point" in the relationship between social media companies and the public. Mike Proulx, research director at advisory firm Forrester, noted that negative public sentiment toward social media has been building for years and has now reached a critical mass. Governments worldwide are beginning to respond: Australia has implemented restrictions to curb children's social media use, and the UK is conducting a pilot program to explore banning social media access for those under 16.
Political leaders have also weighed in. UK Prime Minister Sir Keir Starmer acknowledged that the current state of affairs is "not good enough" and emphasized the need for more robust protections for children online. He highlighted the government's consultation on potentially banning social media for under-16s, stating, "It's not if things are going to change, things are going to change. The question is, how much and what are we going to do?"
In a parallel statement, the Duke and Duchess of Sussex, known for their advocacy on social media harms, called the verdict a "reckoning" and urged that children's safety be prioritized above profit.
During the trial, Meta's CEO Mark Zuckerberg testified, reiterating the company's longstanding policy that users under 13 are not allowed on its platforms. However, internal documents revealed that Meta was aware many young children were using Instagram and Facebook regardless. Zuckerberg expressed regret that the company had not moved faster to identify and block underage users, but maintained that it had ultimately arrived at the right approach.
The trial focused primarily on Instagram and Meta, although Google and YouTube were also defendants. Two other social media companies, Snap and TikTok, were initially named in the suit but settled with Kaley out of court before the trial began.
Kaley's legal team portrayed the platforms as "addiction machines," designed to keep young users engaged for extended periods. Kaley herself testified that she started using Instagram at age nine and YouTube at six, without encountering any age verification or attempts to block her. She described how her social media use led her to withdraw from family interaction and contributed to early symptoms of anxiety and depression, conditions she was later formally diagnosed with. She also developed body dysmorphia, a disorder characterized by excessive preoccupation with perceived flaws in physical appearance, after frequently using Instagram filters that altered her facial features.
Her lawyers argued that Instagram features such as infinite scroll were intentionally designed to be addictive. They presented testimony from former Meta executives and outside experts asserting that the company's growth strategy specifically targeted young users, who were more likely to remain engaged over time. When Kaley's legal team highlighted that she had spent up to 16 hours on Instagram in a single day, Instagram's head Adam Mosseri downplayed the behavior, calling it "problematic" rather than an indication of addiction.
Following the verdict, Kaley's attorneys stated that the jury's decision "sends an unmistakable message that no company is above accountability when it comes to our children." The case is expected to influence hundreds of similar lawsuits currently progressing through US courts, setting a precedent for holding social media companies responsible for the welfare of their young users.
Looking ahead, another significant case against Meta and other social media firms is scheduled to begin in June in a federal court in California. This ongoing legal scrutiny reflects growing societal concern over the influence of social media on minors and may prompt further regulatory and legislative action.
As public awareness and legal challenges mount, governments, families, and advocacy groups continue to push for stronger measures to protect children from the potential harms of social media. The recent Los Angeles verdict marks a pivotal moment in this evolving narrative, underscoring the urgent need for social media companies to address the mental health risks their platforms may pose to young users.
