Meta ordered to pay $375m after jury finds it harmed children’s mental health

A jury in New Mexico found the tech giant misled users over the safety of Facebook, Instagram and WhatsApp, and failed to do enough to protect children.


Meta has been ordered to pay $375 million in civil penalties after a jury in New Mexico found it knowingly harmed children’s mental health.

The company, which owns Facebook, Instagram and WhatsApp, was accused of misleading users about how safe its platforms were and enabling child sexual exploitation.

Meta denied breaching New Mexico’s consumer protection law and has said it will appeal.

The verdict came after a seven-week trial and is likely to be closely watched as similar cases play out across the US.

Announcing the outcome, New Mexico attorney general Raúl Torrez said: “The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.”

He added: “Meta executives knew their products harmed children, disregarded warnings from their own employees and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

The case relied in part on an undercover investigation in which state agents created social media accounts posing as children to document sexual approaches from adults, and Meta’s response.

Prosecutors also argued the company failed to tackle the dangers of social media addiction, saying features including infinite scroll and autoplay videos kept younger users hooked and could contribute to depression, anxiety and self-harm.

During closing arguments, lawyer Linda Singer told the jury: “Over the course of a decade, Meta has failed over and over again to act honestly and transparently.

“It’s failed to act to protect young people in this state. It is up to you to finish this job.”

Meta defended its safety policies during the trial.

Its lawyer Kevin Huff said: “Evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business.

“Meta designs its apps to help people connect with friends and family, not to try to connect predators.”

A Meta spokesperson said after the verdict: “We respectfully disagree with the verdict and will appeal.”

They added: “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.

“We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”