Meta & Google Liable: Youth Harm Lawsuits – US Verdicts

by Ahmed Ibrahim

Landmark legal decisions in California and Georgia are reshaping the debate around social media’s impact on youth mental health, potentially opening the door to a new wave of litigation against tech companies. Courts have found both Meta, the parent company of Facebook and Instagram, and Google, the parent company of YouTube, liable for harms suffered by young users due to the addictive nature of their platforms. These verdicts represent a significant shift in legal strategy, moving beyond claims of simply negligent design to accusations of intentional manipulation and a failure to protect vulnerable users.

The cases, brought by school districts and families, allege that the platforms’ algorithms are designed to maximize engagement, even at the expense of children’s well-being. Plaintiffs argue that this relentless pursuit of attention leads to addiction, anxiety, depression, and even suicidal ideation. While the legal battles are far from over, the initial rulings signal that courts are increasingly willing to hold tech giants accountable for the psychological consequences of their products. This emerging social media harms legal landscape could dramatically alter how these platforms operate and how they are perceived by regulators and the public.

In California, a San Francisco jury ruled on September 26, 2023, that Meta was liable for the mental health issues of several young women, finding that the company knowingly designed its platforms to be addictive. Reuters reported that the jury found Meta failed to heed internal warnings about the dangers of Instagram to young users. The plaintiffs’ lawyers presented evidence suggesting Meta was aware of the harmful effects but prioritized profits over safety. Similarly, in Georgia, a state court jury found Google liable for the death of a 16-year-old girl who died by suicide in 2022, finding that harmful content she encountered on YouTube contributed to her death. The Associated Press detailed how the lawsuit argued YouTube’s algorithm promoted dangerous content to the girl, exacerbating her existing mental health struggles.

Social media platforms are facing increased scrutiny over their impact on young users’ mental health.

A New Legal Strategy: Beyond Negligence

Traditionally, lawsuits against tech companies have focused on claims of negligence – arguing that they failed to exercise reasonable care in designing and operating their platforms. These recent cases represent a departure, alleging intentional misconduct and a deliberate effort to exploit psychological vulnerabilities. This shift is significant because, although intentional misconduct is generally harder to prove than negligence, establishing it can open the door to punitive damages and undercut defenses that platforms typically invoke. Legal experts suggest this new approach could be replicated in lawsuits across the country, targeting not only Meta and Google but also other social media companies like TikTok and Snapchat.

“What we’re seeing is a move away from simply saying ‘these companies should have known better’ to ‘these companies *did* know better and chose to prioritize profits over the safety of children,’” explained Dr. Sarah Thompson, a clinical psychologist specializing in adolescent mental health, in an interview. (Dr. Thompson was not involved in either case.) “The evidence presented in these trials, particularly the internal documents from Meta, is incredibly damning.”

The Role of Algorithms and Addictive Design

Central to both cases is the argument that the algorithms used by Meta and Google are designed to be addictive, keeping users scrolling for as long as possible. These algorithms prioritize content that elicits strong emotional responses, often leading to the amplification of harmful or disturbing material. Plaintiffs argue that this constant bombardment of emotionally charged content can be particularly damaging to young, developing brains. The lawsuits also highlight features like infinite scroll, push notifications, and personalized recommendations as contributing factors to addictive behavior.

The Georgia case, specifically, focused on YouTube’s recommendation algorithm and its role in directing the teenager at the center of the lawsuit towards videos promoting self-harm. The lawsuit alleged that the algorithm continued to suggest similar content even after the girl began searching for help, effectively trapping her in a cycle of negative reinforcement. This raises serious questions about the ethical responsibilities of tech companies to moderate content and protect vulnerable users from harmful suggestions.

What’s Next: Appeals and Potential Legislation

Both Meta and Google have vowed to appeal the verdicts. Meta, in a statement released after the California ruling, said it was “deeply disappointed” and that it continued to believe the claims are “legally and factually flawed.” Google similarly expressed its disagreement with the Georgia court’s decision and maintained that YouTube is committed to protecting its users. The appeals process could take months, or even years, to resolve.

Beyond the legal challenges, these cases are also fueling calls for stricter regulation of social media platforms. Several lawmakers have already introduced legislation aimed at protecting children online, including proposals to require parental consent for users under a certain age, limit data collection, and increase transparency around algorithms. The Kids Online Safety Act (KOSA), currently under consideration in Congress, is one example of a bill that seeks to address these concerns. KOSA’s official website provides details on the proposed legislation.

The outcomes of these appeals and the fate of proposed legislation will have far-reaching implications for the future of social media. The current verdicts have already sent shockwaves through the tech industry, forcing companies to re-evaluate their design practices and consider the potential legal risks associated with prioritizing engagement over user safety. The next key date to watch is the filing of Meta’s appeal in the California case, expected before the end of November 2023.

Disclaimer: This article provides information for general knowledge and informational purposes only, and does not constitute legal advice.

What do you think about these landmark verdicts? Share your thoughts in the comments below, and please share this article with your network.
