Adam Maida
For years, giant social networks like Facebook, Twitter, and Instagram have operated under two key principles.
First, the platforms can decide what content to keep online and what to remove, free of government oversight. Second, the sites cannot be held legally responsible for most of what their users post, shielding the companies from lawsuits over defamatory speech, extremist content, and real-world harm linked to their platforms.
Now, the Supreme Court is poised to reconsider those rules, potentially leading to the most significant reset of the doctrine governing online speech since U.S. officials and courts decided to leave the web largely unregulated in the 1990s.
On Friday, the Supreme Court is scheduled to discuss whether to hear two cases challenging laws in Texas and Florida that bar online platforms from removing certain political content. The court is also set to hear lawsuits challenging Section 230, a 1996 law that shields platforms from liability for content posted by their users.
The cases could ultimately alter the largely hands-off legal posture the United States has taken toward online speech, and they risk upending the businesses of TikTok, Twitter, Snap, and Meta, which owns Facebook and Instagram.
Daphne Keller, a former Google lawyer who directs a program at Stanford University's Cyber Policy Center, said the cases arrive at a moment when the legal rules for online speech could change entirely.
The cases are part of a global battle over how to handle harmful speech online. In recent years, as Facebook and other sites have attracted billions of users and become influential channels of communication, the power they wield has come under increasing scrutiny, raising questions about the influence social networks hold over elections, genocides, wars, and political debates.
In several parts of the world, legislators are moving to curb the platforms' power over speech. Last year, European lawmakers approved rules requiring internet companies to take steps to remove illegal content and to be more transparent about how they recommend content to users.
In the United States, where the First Amendment protects free speech, legislative action has been more limited. For the past three years, lawmakers in Washington have grilled the tech giants' chief executives over their content decisions, but proposals to regulate harmful content have gone nowhere.
Partisanship has deepened the stalemate. Some Republicans accuse sites like Facebook and Twitter of censorship and have pressured the platforms to leave more content up. Democrats, by contrast, argue that the platforms should remove more content, such as misinformation about health.
A Supreme Court case challenging Section 230 of the Communications Decency Act could have many ramifications. While newspapers and magazines can be sued over what they publish, Section 230 shields online platforms from lawsuits over most content posted by their users. It also protects platforms from lawsuits when they take down posts.
For years, judges cited the law in dismissing claims against Facebook, Twitter, and YouTube, ensuring the companies did not take on new legal liability with each status update, post, and viral video. Critics say the law has been a get-out-of-jail-free card for the tech giants.
Mary Anne Franks, a law professor at the University of Miami, said, "If they're not going to be held accountable on the back end for the harms they contribute to, they might as well be as reckless as possible."
The Supreme Court has so far declined to hear several cases challenging the law. In 2020, the court turned away a lawsuit brought by the family of a victim of a terrorist attack who said Facebook was responsible for promoting extremist content.
But the court is scheduled to hear Gonzalez v. Google on Feb. 21. The lawsuit was filed by the family of an American killed in Paris in an attack by Islamic State sympathizers, who argue that YouTube should not be shielded from claims that the site supported terrorism. The suit contends that recommendations qualify as a form of content produced by the platform itself, placing them outside Section 230's protections.
The next day, the court is set to hear a second case, Twitter v. Taamneh. It addresses a related question: when platforms can be held legally liable for supporting terrorism under federal law.
Eric Schnapper, a University of Washington law professor and one of the attorneys representing the plaintiffs in both cases, said in an interview that the arguments were narrow and would not change the internet broadly. "The whole system will not collapse," he said.
But Halimah DeLaine Prado, Google's general counsel, said in an interview that an adverse ruling, however narrowly framed, "would fundamentally change how the internet works," adding that the legal protections are "integral" to the web.
Tech companies are also watching the Texas and Florida cases closely. After Twitter and Facebook barred President Donald J. Trump following the Jan. 6, 2021, riot at the U.S. Capitol, both states passed laws restricting social networks from removing certain content. Texas's law allows users to sue large online platforms that take down posts because of their "viewpoint." Florida's law fines platforms that permanently ban the accounts of candidates for office in the state.
President Donald J. Trump's televised address from the Rose Garden, shown in the White House briefing room on Jan. 6, 2021. Pete Marovich
NetChoice and the Computer & Communications Industry Association, trade groups funded by technology companies including Facebook, Google, and Twitter, sued in 2021 to block both laws.
Chris Marchese, a counsel at NetChoice, said, "This is a roundabout way to punish companies that exercise First Amendment rights that others disagree with."
In Florida, a federal judge sided with the trade groups, ruling that the law violated the platforms' First Amendment rights, and the 11th Circuit Court of Appeals upheld most of that decision. The Fifth Circuit Court of Appeals, however, upheld the Texas law, rejecting "the idea that corporations have a freewheeling First Amendment right to censor what people say."
That split makes Supreme Court intervention more likely. Jeff Kosseff, an associate professor of cybersecurity law at the United States Naval Academy, said the Supreme Court often steps in to resolve disputes when federal appeals courts reach different answers to the same question.
A spokeswoman for Florida Attorney General Ashley Moody pointed to documents the state filed with the Supreme Court arguing that the law "protects the public's access to information." A spokeswoman for Texas Attorney General Ken Paxton did not respond to a request for comment.
If the Supreme Court decides to hear the challenges, it could schedule them for the term ending in June or for the next term, which runs from October into the summer of 2024.
"We believe the courts are now in a position to issue new judgments over the internet," Kosef said.