Key takeaways:
- A New Mexico jury found Meta Platforms Inc. violated state consumer protection laws by harming children’s mental health and failing to address child sexual exploitation, resulting in thousands of violations with significant penalties.
- A related civil trial in Los Angeles involves allegations that Meta and YouTube designed addictive products harming children’s mental health, with over 1,600 plaintiffs involved and verdict details pending.
- These cases reflect growing legal scrutiny of social media companies’ impact on youth mental health, challenging Section 230 protections and signaling a shift toward greater accountability for online platforms.
A New Mexico jury has ruled that Meta Platforms Inc., the parent company of Instagram, Facebook, and WhatsApp, violated state consumer protection laws by harming children’s mental health and failing to adequately address child sexual exploitation on its platforms. The verdict, delivered after a nearly seven-week trial, found that Meta committed thousands of violations of New Mexico’s Unfair Practices Act, with penalties that could total $375 million. The jury concluded that Meta prioritized profits over safety, made false or misleading statements, and engaged in “unconscionable” trade practices that exploited children’s vulnerabilities.
The lawsuit, filed in 2023 by New Mexico Attorney General Raúl Torrez, drew in part on an undercover investigation in which state agents posed as children on social media to document sexual solicitations and Meta’s responses. Prosecutors argued that Meta’s algorithms promoted harmful content and that the company failed to fully disclose or address the risks of social media addiction, claims that Meta disputes. Meta’s attorneys maintained that the company works to keep users safe and that some harmful content inevitably slips through despite its efforts. Meta spokesperson Andy Stone stated the company “respectfully disagree[s] with the verdict and will appeal,” emphasizing its commitment to protecting teens online.
In a related development, a separate civil trial in Los Angeles involving Meta and YouTube addressed allegations that the companies deliberately designed their products to be dangerously addictive to children. The plaintiff, identified as K.G.M., is a now-20-year-old woman who testified that her extensive social media use contributed to depression, anxiety, and body dysmorphia. Her case is the first in a consolidated group of lawsuits involving more than 1,600 plaintiffs, including families and school districts. After nearly nine days of deliberations, the jury reached a verdict, though details of the decision were pending at the time of reporting. Meta and YouTube have denied claims that their platforms are purposefully harmful or addictive.
These trials come amid increasing scrutiny of social media companies’ roles in public health and safety, particularly concerning children and adolescents. More than 40 state attorneys general have filed lawsuits alleging that Meta’s platforms contribute to a mental health crisis among young people by designing addictive features. The cases challenge longstanding legal protections under Section 230 of the Communications Decency Act, which shields internet companies from liability for user-generated content. Legal experts and advocates note that these trials mark a significant shift, demonstrating that social media companies can be held accountable in court. Matt Bergman, founding attorney of the Social Media Victims Law Center, said the progression to trial is a milestone for victims seeking justice and transparency, with many more cases expected in the future.