
Federal Judge Gives Green Light to Social Media Addiction Lawsuit


— November 16, 2023

U.S. District Judge Yvonne Gonzalez Rogers noted that many popular social media platforms seem designed to encourage “predatory” behavior.


A federal judge has refused to dismiss a series of lawsuits filed against social media companies, most of which accuse platforms like Snapchat and TikTok of employing intentionally addictive algorithms that damage children’s mental health.

According to Reuters, U.S. District Judge Yvonne Gonzalez Rogers issued her ruling against Alphabet, Meta Platforms, ByteDance, and Snap earlier this week.

The decision affects hundreds of lawsuits filed on behalf of individual children, all of whom say that they have suffered physical, mental, and emotional injuries as a result of the defendant companies’ allegedly negligent policies and procedures.

As LegalReader.com has reported before, the complaints seek compensatory damages and an injunction against purportedly predatory business practices.

“Today’s decision is a significant victory for the families that have been harmed by the dangers of social media,” said attorneys Lexi Hazam, Previn Warren, and Chris Seeger, all lead counsel in claims against the companies. “The Court’s ruling repudiates Big Tech’s overbroad and incorrect claim that Section 230 or the First Amendment should grant them blanket immunity for the harm they cause to their users.”

The New York Post notes that the defendant companies had sought dismissal on the grounds that they are shielded from liability by Section 230 of the Communications Decency Act of 1996, codified in Title 47 of the U.S. Code, which protects technology companies from claims arising out of third-party content.

However, Rogers addressed these arguments in her ruling, finding that Section 230’s protections are simply inapplicable to many of the plaintiffs’ claims.

A gavel. Image via Wikimedia Commons via Flickr/user: Brian Turner. (CC BY 2.0).

“Nothing in Section 230 or existing case law indicates that Section 230 only applies to publishing where a defendant’s only intent is to convey or curate information,” Rogers wrote. “To hold otherwise would essentially be to hold that any website that generates revenue by maintaining the interest of users and publishes content with the intent of meeting this goal, would no longer be entitled to Section 230 immunity.”

Rogers said that the plaintiffs’ claims went beyond allegations of inappropriate third-party content, and that none of the social media defendants were able to make a compelling argument for immunity.

In her 52-page ruling, Rogers observed that many social media products include features seemingly designed to “enable coercive, predatory behavior toward children,” including “limitations on content length,” “notifications […] to draw them back to their respective platforms,” and “engagement-based algorithms.”

“Defendants either do not require users to enter their age upon sign-up or do not have effective age-verification for users, even though such technology is readily available and, in some instances, used by defendants in other contexts,” she said.

José Castañeda, a spokesperson for Alphabet subsidiary Google, told media outlets that “the allegations in these complaints are simply not true.”

“Protecting kids across our platforms has always been core to our work,” he said. “In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls.”

Sources

Big Tech loses bid to toss lawsuits alleging social media platforms harmed children

Social media companies must face youth addiction lawsuits, US judge rules

Social media giants must face child safety lawsuits, judge rules
