While the court broadly sided with companies including Google and Twitter, it declined to address bigger questions about the scope of Section 230, the federal law shielding social media platforms from liability over content posted by third-party users.
The Supreme Court has sided with technology companies including Google, Facebook, and Twitter, rejecting a lawsuit alleging that social media platforms should be held liable for an overseas terror attack that killed 39 people in 2017.
According to The Associated Press, while the justices unanimously rejected the complaint, they sidestepped the broader issues underpinning the lawsuit, including the legality of a federal law that shields social media companies from being sued over content posted by third-party users.
During the same session, the Supreme Court also returned a related complaint to a lower court.
That complaint, writes The Associated Press, was filed by the family of an American college student killed in an Islamic State-coordinated terror attack in Paris in 2015.
While the lawsuit was returned to a lower court, the justices’ ruling left little of the plaintiffs’ claims intact.
The Associated Press reports that the Supreme Court had initially taken up the cases to determine whether social media companies’ legal shield was too broad.
However, and somewhat unexpectedly, the justices said that it was not necessary to address this issue, as there is little evidence tying Google to the Paris attack.
“We therefore decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief,” the court wrote in an unsigned opinion.
Technology industry groups have already begun celebrating the ruling, even though the Supreme Court could still hear similar lawsuits in the future.
“This is a huge win for free speech,” NetChoice attorney Chris Marchese said in a statement. “The court was asked to undermine Section 230—and declined.”
“Even with the best moderation systems available, a service like Twitter alone cannot screen every single piece of user-generated content with 100% accuracy,” Marchese said. “Imposing liability on such services for harmful content that unintentionally falls through the cracks would have disincentivized them from hosting any user-generated content.”
POLITICO reports that the justices were “emphatic” in their rejection of the lawsuit seeking to establish liability for the 2017 nightclub attack in Turkey—effectively allaying fears that large technology companies could be held financially responsible for the spread of radical, terroristic content on their platforms.
Justice Clarence Thomas offered additional comment on the Turkey-related claim, saying that a company with imperfect moderation systems cannot be said to be effectively “aiding and abetting” terror organizations.
“The point of aiding and abetting is to impose liability on those who consciously and culpably participated in the tort at issue,” wrote Thomas, the controversial right-wing justice who has become embroiled in a potential corruption scandal. “When there is a direct nexus between the defendant’s acts and the tort, courts may more easily infer such culpable assistance. But, the more attenuated the nexus, the more courts should demand that plaintiffs show culpable participation through intentional aid that substantially furthered the tort.”