By Giacomo Romos
on November 6, 2022
“The bird is released”, Elon Musk tweeted the night he completed his $44 billion purchase of Twitter.
What he didn’t say is that a series of court cases could soon clip its wings.
A self-described free speech absolutist, Musk has hinted that he will relax Twitter’s content moderation rules, allow more objectionable speech to remain on the site and restore some users who have been banned. Three days after reassuring advertisers that he won’t let Twitter become a “free-for-all hellscape,” he demonstrated his own freewheeling approach to speech when he tweeted (and then deleted) a link to a false conspiracy theory about the husband of House Speaker Nancy Pelosi.
Musk’s takeover and anticipated overhaul of Twitter come at an extraordinary time. Internet law may be about to undergo its most dramatic transition since the days of CompuServe and AOL. As Georgetown legal scholar Anupam Chander has written, Silicon Valley has flourished in the United States largely thanks to a well-crafted legal regime. Lawmakers and courts in the late 20th century enacted several substantial reforms that allowed new technology companies to operate without fear of legal liability, just as 19th-century judges devised common law principles to promote industrial development. The legal pillars that have helped the internet grow are the same ones that would enable Musk to implement many of the changes he has suggested. But those pillars are under threat.
Last month, the Supreme Court agreed to hear two cases testing the biggest pillar: Section 230 of the Communications Decency Act, the landmark 1996 law that immunizes technology companies from civil lawsuits arising from user-generated content they host on their platforms. Under Section 230, if a user posts libel, harassment or other forms of harmful speech (such as spreading conspiracy theories about an 82-year-old assault victim), the individual user can be sued, but the platform (with a few exceptions) cannot be.
Gonzalez v. Google and Twitter v. Taamneh could change that. Gonzalez asks whether a platform loses Section 230 immunity when it recommends or amplifies harmful user content. Taamneh asks whether a company can be held liable for “aiding and abetting” terrorism if pro-terrorism content appears on its platform (even though the company aggressively removes most pro-terrorism speech).
Many law and technology experts were shocked when the court decided to review these cases (which are due to be heard next year). Typically, the justices will not take up such cases unless the circuit courts are divided on the underlying legal issues, and there is no true circuit split here. (The lower courts that have considered the matter have been fairly uniform in their broad interpretations of Section 230.)
So the fact that the court has accepted the cases suggests that at least some justices want to narrow Section 230. One of them, Justice Clarence Thomas, has already telegraphed his view: In opinions issued last year and earlier this year, he questioned the law’s broad protections and called on his colleagues to examine them carefully. (I have written before about how ideas Thomas expresses in solo opinions are increasingly winning majorities on the new conservative court.)
Separately, two more cases are waiting in the wings. In NetChoice v. Paxton and Moody v. NetChoice, the tech industry is challenging Texas and Florida laws that limit the platforms’ authority to remove user-generated content. Politicians in those states believe tech companies are biased against politically conservative discourse and are trying to curb what they call censorship. The tech companies argue that the First Amendment (not to mention Section 230!) protects their right to set their own rules for their platforms, including banning speech that isn’t necessarily illegal but is harmful, such as disinformation about elections or COVID vaccines.
The Supreme Court has not yet decided whether to take up the NetChoice cases. But unlike in Gonzalez and Taamneh, there is a circuit split: The United States Court of Appeals for the 5th Circuit (in an opinion by a former clerk of Justice Samuel Alito) upheld the Texas law, while the United States Court of Appeals for the 11th Circuit struck down Florida’s similar law. So the justices will most likely step in.
The upshot for Twitter and other social media companies is a new world of largely unknown risks. If the Supreme Court narrows Section 230, Musk can forget his promise of lighter-touch moderation. Nearly everything Twitter does is built around content recommendations produced by complex algorithms, which in turn respond to the unpredictable behavior of human users. (The same goes for every other major social media company, and for search engines too.) If platforms could be sued over those recommendations, they would have to remove much more content on the front end.
Should the court uphold the Texas and Florida laws, the companies would also face new penalties for taking down too much content. And the conundrum could get even worse: One can imagine blue states passing their own platform regulations that directly conflict with those of red states, requiring platforms, say, to remove the very disinformation that red states insist cannot be removed.
Chander believes the ultimate loser in such a regime would be precisely what Musk professes to stand for: free speech and an open internet.
“If we impose huge liability on platforms from both the left and the right,” he said, “those platforms will act in ways that drastically reduce their risks, with serious consequences for our practical freedom of speech online.”
Congress, of course, could fix this by clarifying the scope of Section 230. Its key provision, after all, is only 26 words long and 26 years old — it might be time for an update. Congress could also use its power under the Constitution’s supremacy clause to preempt any state law that conflicts with Section 230’s protections. But reform proposals (from both left and right) haven’t taken off. Until they do, we’re all flying blind.
This column was originally published Nov. 3 in the National Journal and is owned and licensed by National Journal Group LLC.