Jury Holds Meta and Google Liable for Social Media Addiction

Apr 6, 2026, 2:39 AM


A jury in California has delivered a significant verdict against Meta and Google, finding the tech giants negligent for their role in the mental health struggles of a young woman, identified as KGM, who became addicted to their social media platforms as a child. The verdict, announced on Wednesday, awarded the plaintiff $6 million in damages, with Meta responsible for 70% of the payout.
This case is notable as it is the first of its kind to hold social media companies accountable for the design of addictive platforms, paralleling the legal battles faced by the tobacco industry in the 1990s. The jury's decision reflects a growing awareness of how these platforms can exploit the vulnerabilities of young users, with KGM testifying that her use of Instagram and YouTube led to severe anxiety and body image issues.
The jury found that Meta and Google failed to warn users of the dangers associated with their platforms, concluding that design features such as infinite scrolling and push notifications were intentionally addictive. This approach marked a shift away from focusing solely on the content hosted on these platforms toward scrutinizing the underlying architecture that fosters compulsive use.
KGM, now 20 years old, began using social media at a young age, starting with YouTube at six and Instagram at eleven. During the trial, her legal team argued that the companies' executives were aware of the potential harms yet continued to prioritize user engagement over user safety. Internal documents presented in court indicated that Meta aimed to attract younger users, with some memos suggesting that engaging preteens was crucial for long-term growth.
Despite the jury's findings, both companies have announced plans to appeal the verdict. Meta has stated that teen mental health is complex and cannot be attributed to a single app, while Google has argued that YouTube operates as a responsibly built streaming platform rather than a social media site.
The implications of this ruling extend beyond this single case, as it sets a precedent for approximately 2,000 other lawsuits against social media companies alleging similar harms to minors. Legal experts view this verdict as a potential catalyst for broader changes in how social media platforms operate in relation to children and teenagers.
The trial also coincided with another major ruling against Meta in New Mexico, where the company was ordered to pay $375 million for failing to protect young users from online predators. This series of verdicts highlights a growing movement among plaintiffs' attorneys and lawmakers to hold tech companies accountable for their practices.
As the trial concluded, jurors expressed a desire to send a clear message to tech companies about the dangers of their platforms, emphasizing the need for accountability in the industry. The outcome is anticipated to influence future legal strategies aimed at reforming social media practices, particularly regarding the protection of younger audiences.
This landmark verdict marks a critical turning point in the relationship between technology and mental health, signaling that jurors are ready to hold companies responsible for the consequences of their designs. As the legal landscape evolves, it remains to be seen how these verdicts will shape the future of social media regulation and the responsibility of tech giants toward their youngest users.
With this case, advocates for mental health awareness and responsible technology are hopeful that the tide is turning toward greater scrutiny and reform in the tech industry.

Related articles

Mental Health Workers Demand AI Protections Amid Strikes in California

Mental health workers in California are on strike, advocating for protections against the increasing use of artificial intelligence (AI) in patient care at Kaiser Permanente. They argue that unregulated AI could threaten jobs and compromise patient safety, prompting calls for legislative action and cross-union solidarity.

Meta and Google Found Liable for Social Media Harms to Kids

A Los Angeles jury has ruled that Meta and Google are liable for the mental distress caused to a teenager by their platforms, awarding $3 million in damages. The case highlights concerns about social media addiction and its impact on young users, potentially paving the way for further legal actions against tech giants.

Oregon Lawmakers Push AI Regulations to Safeguard Youth Mental Health

Oregon lawmakers are advancing Senate Bill 1546, which aims to regulate AI chatbots to protect youth from mental health crises. The legislation mandates that chatbots must disclose their artificial nature and provide referrals to mental health resources when users exhibit signs of self-harm or suicidal thoughts.

Oregon Lawmakers Propose AI Chatbot Regulations for Child Safety

Oregon lawmakers are introducing legislation to regulate AI chatbots, aiming to protect children's mental health. The proposed bill mandates monitoring for signs of self-harm, prohibits explicit content for minors, and requires clear disclosures that chatbot responses are not human-generated.

Governor Hochul Proposes Comprehensive Measures for Child Safety

Governor Kathy Hochul has announced a series of proposals aimed at enhancing online safety for children, restricting harmful AI chatbots, and expanding mental health resources for youth in New York. These initiatives are designed to address the growing mental health crisis among young people while ensuring their protection in digital environments.