

A US court ruling has dealt a major blow to social media giant Meta Platforms, ordering it to pay $375 million for misleading users about the safety of children on its platforms—raising fresh concerns over how Big Tech handles online risks for minors.
A jury in New Mexico concluded that Meta violated consumer protection laws by creating a false sense of safety around platforms such as Instagram, Facebook and WhatsApp.
The verdict held the company responsible for exposing children to sexually explicit content and potential contact with online predators. The case was brought under the state’s Unfair Practices Act, with jurors finding thousands of violations that together resulted in the hefty penalty.
New Mexico Attorney General Raul Torrez described the ruling as “historic”, calling it the first successful state-level lawsuit against Meta over child safety.
He accused the company of knowingly downplaying risks, alleging that its leadership ignored internal warnings and misled the public about the extent of harm faced by young users.
During the seven-week trial, lawyers for the state presented internal research suggesting widespread exposure to harmful content. At one stage, Meta’s own findings indicated that 16 percent of Instagram users reported seeing unwanted nudity or sexual material within the previous week.
Former Meta engineer Arturo Béjar testified that company experiments showed underage users were routinely served sexualised content. He also recounted that his own daughter received inappropriate propositions from strangers on Instagram.
Meta, led by chief executive Mark Zuckerberg, said it strongly disagrees with the verdict and plans to appeal. The company maintains it has invested heavily in safety measures and continues to improve protections for younger users.
Recent initiatives include enhanced “Teen Accounts” on Instagram and tools to alert parents if children search for self-harm-related content.
The ruling comes amid a broader wave of litigation in the US targeting social media platforms over their impact on children. Thousands of similar cases are currently progressing through courts.
Meta is also facing a separate lawsuit in Los Angeles, where a plaintiff alleges she became addicted to platforms such as Instagram and YouTube because of the way they were designed.
The New Mexico case argued that Meta’s recommendation algorithms actively steered minors towards explicit and harmful content—an allegation that could have far-reaching implications for how digital platforms are regulated going forward.