A New Mexico court has ordered Meta, the parent company of Facebook, Instagram, and WhatsApp, to pay $375m after a jury found the company misled users about the safety of its platforms for children. The verdict marks the first time a state has successfully sued Meta over child safety issues, according to the BBC.

Historic Verdict in Child Safety Case

New Mexico Attorney General Raul Torrez called the verdict “historic” and said it represents a turning point in holding tech companies accountable for the safety of young users. The jury found that Meta violated the state’s Unfair Practices Act by misleading the public about the risks its platforms posed to minors — the case was the first of its kind, with no other state having previously won a similar lawsuit against the company.

The trial, which lasted seven weeks, presented internal Meta documents and testimony from former employees, including accounts of how the company had known about child predators using its platforms and had failed to act. Arturo Béjar, a former Meta engineering leader who became a whistleblower in 2021, testified that he ran experiments on Instagram and found that underage users were shown sexualized content. He also revealed that his daughter was propositioned for sex by a stranger on the platform.

State prosecutors showed internal research from Meta that, at one point, found 16% of all Instagram users had reported being shown unwanted nudity or sexual activity in a single week. This data was used to support claims that the company had been aware of the risks but failed to address them.

Meta’s Defense and Appeal Plans

A Meta spokeswoman said the company disagrees with the verdict and plans to appeal. She stated: “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors and harmful content — we remain confident in our record of protecting teens online.”

The $375m penalty was calculated based on the jury’s determination that there were thousands of violations of the Unfair Practices Act, each with a maximum penalty of $5,000. This approach allowed the court to impose a large civil fine based on the number of alleged violations.

Meta is also facing a separate trial in Los Angeles, where a young woman claims she became addicted as a child to platforms like Instagram and Google-owned YouTube because of their design. This case is part of a wave of similar lawsuits across the United States, many of which involve claims about the addictive nature of social media and its effects on minors.

New Mexico sued Meta in 2023, alleging that the company used recommendation algorithms to steer young users toward content that was sexually explicit, showed child sexual abuse, or exposed them to solicitation of such material and sex trafficking. The state argued that Meta executives knew their products harmed children but disregarded warnings from their own employees and lied to the public about what they knew.

Broader Implications for Tech Companies

Attorney General Raul Torrez said, “Today the jury joined families, educators, and child safety experts in saying enough is enough.” The ruling could set a precedent for other states considering similar lawsuits against major tech companies. It may also increase pressure on Meta to implement stronger safeguards for young users.

Meta has taken steps in recent years to improve child safety on its platforms. In 2024, Instagram launched Teen Accounts, giving young users more control over their experience. Last month, the company introduced a feature that would alert parents if their children are searching for self-harm content.

Despite these efforts, critics argue that more needs to be done. The case highlights the growing scrutiny of how social media companies design their platforms and the impact these designs have on young users. With the rise of similar lawsuits and increased public concern about online safety, the outcome of this case may influence future regulations and company policies.

The case also raises questions about the effectiveness of current content moderation practices and the role of algorithms in shaping user experiences. As more states consider legal action, the pressure on Meta and other tech giants to address these issues is likely to increase.

The next phase of the case will involve Meta’s appeal process, which could take months or even years to resolve. In the meantime, the ruling serves as a significant legal and public relations challenge for the company.