Big Tech’s Dirty Secret: Meta Let Child Predators Rack Up Violations
Remnant Recap
- Meta ignored child safety risks: Internal testimony says offenders could violate rules repeatedly before facing suspension.
- Evidence allegedly hidden: Plaintiffs claim Meta downplayed its own research linking teen mental health harms to platform use.
- Profits over protection: Big Tech kept harmful systems in place while minors faced exploitation and mental health fallout.
The latest court filing against Meta is a blistering reminder of what happens when Big Tech chases profit over basic human decency. Allegations now show Meta knew predators were exploiting kids on Instagram and Facebook, yet repeat offenders were allowed to rack up violation after violation with no real consequences.
Worse, Meta is accused of misleading Congress and hiding internal research showing its platforms fuel anxiety, depression, and exploitation. This is not innovation. It is negligence wrapped in PR spin. America needs accountability, not corporations pretending they are “reimagining safety” while children pay the price for their growth-at-all-costs model.
LifeSiteNews reports:
A newly unsealed court filing alleges that Facebook and Instagram parent company Meta knew how widespread sexual and psychological exploitation of children was on its platforms but publicly downplayed the risk and would knowingly let repeat offenders remain despite more than a dozen violations.
Time magazine reported that the filing, unsealed November 21, contains testimony from former Instagram head of safety and well-being Vaishnavi Jayakumar that, upon her joining the company in 2020, she discovered it had a policy under which one “could incur 16 violations for prostitution and sexual solicitation,” but it would take the 17th before “your account would be suspended.” She claimed to have raised the issue multiple times, only to be rebuffed on the grounds that fixing the issue would be too difficult.
The 1,800-plus plaintiffs, who include families, school districts, and attorneys general across multiple states and localities, accuse Instagram, TikTok, Snapchat, and YouTube of "relentlessly pursu(ing) a strategy of growth at all costs, recklessly ignoring the impact of their products on children's mental and physical health (…)" The filing further alleges that "Meta never told parents, the public, or the Districts that it doesn't delete accounts that have engaged over 15 times in sex trafficking."
“Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” argued Previn Warren, one of the plaintiffs’ lead attorneys. “Like tobacco, this is a situation where there are dangerous products that were marketed to kids. They did it anyway, because more usage meant more profits for the company.”
Meta denied the existence of the so-called "17x" rule, insisting it removes accounts immediately upon such suspicions and has proactively worked for years to protect minors on its platforms. The plaintiffs claim to have documentation corroborating Jayakumar's allegations, and say Meta knew more than it publicly acknowledged about, and failed to act on, evidence of its platforms' negative impact on teen mental health, including eating disorders and suicide.
“We know parents are worried about their teens having unsafe or inappropriate experiences online, and that’s…
Photo credit: Derick Hudson/Getty Images