Meta Ordered to Pay $375 Million in Child Exploitation Case

Meta has been ordered to pay $375 million after a New Mexico jury found the tech giant liable for failing to protect children from exploitation and harmful content on its platforms.

A jury in New Mexico found that Meta, which owns Facebook, Instagram, and WhatsApp, failed to safeguard young users from online risks, including sexually explicit material, solicitation, and human trafficking.

The company was also found liable for misleading consumers about the safety of its platforms and endangering children under the state’s consumer protection laws. Jurors imposed the maximum penalty for each violation, resulting in a total of $375 million in civil penalties.

The decision marks the first time a jury trial has found Meta liable for conduct occurring on its platforms, according to a report by The Guardian.

“The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” New Mexico Attorney General Raúl Torrez says in a statement. “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

The lawsuit, filed by Torrez’s office in December 2023, alleged that the company allowed predators to find underage users and connect with victims, sometimes leading to real-world abuse and human trafficking.

During the seven-week trial, jurors reviewed internal company documents and heard testimony from former employees indicating that Meta was aware of child predators using its platforms. The BBC reports that Arturo Béjar, a former engineering leader at Meta who left the company in 2021 and later became a whistleblower, testified about internal experiments on Instagram that showed underage users were exposed to sexualized content. He said his own daughter was propositioned for sex by a stranger on the platform. State prosecutors also presented internal research from Meta showing that, at one point, 16% of Instagram users reported being shown unwanted nudity or sexual activity within a single week.

“Over the course of a decade, Meta has failed over and over again to act honestly and transparently,” Linda Singer, an attorney for the state, told the jury during closing arguments. “It’s failed to act to protect young people in this state.”

Meta denies the allegations and says it has extensive measures in place to protect younger users. A spokesperson for the company says it disagrees with the verdict and plans to appeal.

“We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content,” the Meta spokesperson says. “We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”

The case is one of dozens of similar lawsuits brought by state attorneys general against Meta and other social media companies.

Meta is also facing thousands of other lawsuits accusing it and other social media companies of designing platforms to encourage compulsive use among young people, practices that critics link to a broader mental health crisis across the U.S.

Image credits: Header photo licensed via Depositphotos.