The Supreme Court dodged a legal landmine Thursday by saying it didn’t need to reinterpret Big Tech’s most crucial legal liability shield, ruling in favor of Google and Twitter in a pair of terrorism lawsuits lodged against them. The court’s decision means Section 230 of the Communications Decency Act—a 1996 provision that prevents platforms from being held liable for content generated by their users—will remain unchanged.
Section 230 supporters feared a ruling limiting its scope would fundamentally alter the types of content social media companies are held legally liable for. Such a ruling could force platforms like Facebook and YouTube to re-examine the ways they use recommendation algorithms to serve users content. In short, the absence of any changes to 230 should come as a huge relief to tech companies hosting content on the internet.
The court dismissed the first case, Gonzalez v. Google, and issued a brief three-page unsigned opinion saying it declined to weigh in on 230 because the basis for the case was simply too weak. In that case, the parents of a teenager killed by ISIS claimed Google aids and abets terrorism when it promotes terrorist content through its YouTube recommendation algorithm.
“We decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief,” the court ruled.
The court similarly ruled unanimously in favor of tech companies in the separate but related case Twitter v. Taamneh, which claimed Twitter aided and abetted terrorism when it failed to sufficiently take down ISIS content on the platform following a 2017 attack.
In a statement sent to Gizmodo, Google welcomed the ruling and said it should come as a relief to creators expressing themselves online.
“Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” Google General Counsel Halimah DeLaine Prado said. Twitter responded to Gizmodo with a poop emoji.
Combined, the two rulings represent major wins for the tech industry, which has relied on 230 protections to grow since they were first introduced nearly 30 years ago. The rulings also highlight the court’s nervousness around amending a provision that has come to define the internet. That uncertainty was on full display during oral arguments for the case earlier this year.
“These are not like the nine greatest experts on the internet,” Justice Elena Kagan said. “We’re a court. We really don’t know about these things.”
Tech industry groups like NetChoice applauded the court’s ruling, which they described as a “huge win” for freedom of speech and expression on the internet. It’s also a win for social media companies interested in moderating content on their platforms without the constant threat of a looming lawsuit. In a statement, NetChoice Litigation Center Director Chris Marchese said a whittling down of Section 230 protections would have made tech companies even less equipped to moderate potentially harmful content.
“With billions of pieces of content added to the internet every day, content moderation is an imperfect—but vital—tool in keeping users safe and the internet functioning,” Marchese said. “Imposing liability on such services for harmful content that unintentionally falls through the crack would have disincentivized them from hosting any user-generated content.”
Tech companies and industry groups aren’t the only ones welcoming the court’s ruling. Civil liberties and privacy organizations like the ACLU, the Electronic Frontier Foundation, and the Knight First Amendment Institute all issued statements praising the court’s decision. Though child safety groups and a growing cohort of lawmakers have railed against 230 in recent years for allegedly making it difficult to hold tech companies accountable for amplifying harmful misinformation, supporters of the provision say a sudden reversal could have a chilling effect on internet speech.
“With this decision, free speech online lives to fight another day,” ACLU National Security Project Deputy Director Patrick Toomey said.
Update May 18, 12:44 P.M. EST: Added statements from Google, NetChoice, and the ACLU.