In his ruling, U.S. District Judge Paul Diamond found that TikTok’s conduct was protected by Section 230 of the Communications Decency Act.
A federal judge has dismissed a wrongful death lawsuit against TikTok, which claimed that the popular social media platform was responsible for a 10-year-old girl’s death.
According to NBC News, the complaint was filed by Taiwanna Anderson of Pennsylvania.
In her lawsuit, Anderson accused TikTok and its parent company, ByteDance, of facilitating the death of her young daughter, Nylah Anderson.
Nylah was found unconscious, hanging from a purse strap, after participating in a so-called “blackout challenge,” which encourages participants to choke themselves until they pass out.
The challenge, said Anderson’s lawsuit, had appeared on Nylah’s featured “For You” page.
Anderson attempted to perform CPR on her unconscious daughter before first responders arrived and took the girl to a hospital.
However, Nylah Anderson passed away after spending several days in a pediatric intensive care unit.
A medical examiner’s report suggested that Nylah had attempted to free herself before losing consciousness.
On Tuesday, U.S. District Judge Paul Diamond of Philadelphia granted TikTok’s motion to dismiss the wrongful death lawsuit, finding that the social media platform is protected by Section 230 of the Communications Decency Act, which protects internet platforms from being held liable for content posted or propagated by third-party users.
“Defendants did not create the challenge; rather, they made it readily accessible on their site,” Diamond wrote in his ruling. “Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be interested in it. In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability.”
Jeffrey Goodman, an attorney representing the Anderson family, told NBC News that “the Anderson family will continue to fight to make social media safe so that no other child is killed by the reckless behavior of the social media industry.”
“The federal Communications Decency Act (CDA) was never intended to allow social media companies to send dangerous content to children,” Goodman said in a statement.
NBC News notes that Anderson’s lawsuit did not seek to hold TikTok liable as the publisher of the video. Instead, the complaint sought to hold the company accountable for its “dangerously defective” algorithm, as well as its “negligent” conduct.
“The TikTok Defendants’ app and algorithm have created an environment in which TikTok ‘challenges’ are widely promoted and result in maximum user engagement and participation,” the lawsuit said.
However, Diamond’s ruling found that such arguments are nonetheless “inextricably linked” to the way TikTok publishes third-party content, and that the company therefore cannot be held liable for damages under Section 230.