Sex trafficking on Meta platforms was widely tolerated and difficult to report, according to court filings unsealed last week.
The court filings show that Facebook’s parent company failed to act promptly on accounts engaged in sex trafficking and child exploitation, allowing illicit content to remain on its platforms despite repeated violations.
The accusation is part of a lawsuit filed in California by more than 1,800 plaintiffs – including school districts, children and parents, and state attorneys general – who allege that social media giants “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health.”
RT reports: Alongside Meta – which owns Facebook, Instagram, WhatsApp, and Threads – the suit targets Google’s YouTube, ByteDance’s TikTok, and Snap’s Snapchat.
Former Instagram safety chief Vaishnavi Jayakumar testified she was shocked to learn that Meta maintained a “17-strike” policy for accounts allegedly involved in human sex trafficking.
“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” she said, calling the threshold “very, very high” by industry standards.
The brief alleges Meta was aware of serious harms on its platforms, including millions of adult strangers contacting minors, products that worsened teen mental-health issues, and frequent detection – but rare removal – of content related to suicide, eating disorders and child sexual abuse.
Responding to the allegations, Meta told USA Today it now enforces a “one strike” policy and immediately removes accounts involved in human exploitation, saying its former 17-strike system has been replaced.
The company has come under mounting scrutiny in the US. Earlier this year, reports that Meta’s AI chatbots could engage minors in sensual exchanges led to new safeguards for teen accounts, giving parents the option to block interactions with the bots.
Meta is also confronting expanding legal and regulatory challenges globally. Russia designated the firm an “extremist organization” in 2022 for refusing to remove prohibited content. The tech giant is facing multiple actions in the EU, including a €797 million antitrust fine tied to Facebook Marketplace, as well as separate copyright, data-protection and targeted advertising cases in Spain, France, Germany, and Norway.

