A recently filed set of legal documents indicates that Meta, the owner of Instagram and Facebook, adopted an extraordinarily lenient "17x" policy toward accounts suspected of belonging to sex traffickers, letting them share content related to solicitation and prostitution up to 16 times before being suspended upon the 17th and final "strike."
According to The Mercury News, the alleged policy is described in a court filing by plaintiffs in an ongoing lawsuit against Meta and several other companies, including the Google-owned YouTube, Snapchat, and TikTok.
Many of the plaintiffs, ranging from parents, students, and teachers to state-level agencies, initially filed separate claims, which have since been consolidated through multidistrict litigation.
“Despite earning billions of dollars in annual revenue — and its leader being one of the richest people in the world — Meta simply refused to invest resources in keeping kids safe,” the plaintiffs said in documents submitted Friday to U.S. District Court in Oakland, California.
The filing alleges that Meta told "outright lies" about its products' potential dangers, thereby preventing "even the most vigilant administrators, teachers, parents, and students from understanding and heading off the dangers inherent to Facebook and Instagram."

Attorneys cited evidence from internal company communications and research reports, as well as sworn depositions from current and former employees. The Mercury News notes that the filing contains extensive references to evidence that is not yet available to the public or the press and, as such, cannot be readily verified.
One report detailed in the filing, though, claims that Instagram's account-recommendation feature shared the profiles of at least 2 million children with adults seeking to groom minors. More than 1 million "potentially inappropriate adults," meanwhile, were recommended to teenage users in 2022.
Facebook’s similar recommendation feature was, according to a Meta employee, “responsible for 80% of violating adult/minor connections.”
The filing provides additional evidence from discovery, the evidence-sharing stage of litigation that precedes trial. Meta’s own researchers, for instance, seemed to concede that the company’s policies were bad for young users, with one worker likening Instagram’s algorithmic practices to street-level drug dealing.
“We’re basically pushers,” the employee allegedly said. “Teens are hooked despite how it makes them feel.”
The recently submitted court documents also include testimony from Vaishnavi Jayakumar, Instagram's former head of safety and well-being, who described Meta's alleged "17x" policy.
“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” she said. “By any measure across the industry, [it was] a very, very high strike threshold.”
Jayakumar, the filing indicates, was unhappy with Meta's approach toward safety. Instagram, for instance, has a zero-tolerance policy for child sexual abuse material, yet offered no easy way for users to report potentially offending content. Jayakumar raised the issue multiple times, but was told that it would be too difficult to address. At the same time, Jayakumar noted, Instagram made reporting other offenses—like spam, intellectual property violations, and promotion of firearms—relatively simple.
The lawsuit alleges that Meta has long been aware of these shortcomings. In 2020, while trying to determine how enhanced privacy controls and teen-safety policies could affect growth, Meta received recommendations from separate policy, legal, communications, privacy, and well-being teams, all of which advised making teen accounts private by default.
The recommendation was put on hold, possibly over concerns that limiting "unwanted interactions" could reduce engagement among young users.
It was not implemented until 2024, when enhanced privacy settings became the default for all teen accounts.
Sources
Court Filings Allege Meta Downplayed Risks to Children and Misled the Public
Lawsuit: Meta allowed sex-trafficking posts on Instagram as it put profit over kids’ safety