Who’s to blame when the web goes wrong?
In two appeals, the Supreme Court considers social media platforms’ immunity for terrorist acts
During oral arguments in two social media cases last week, the Supreme Court justices seemed to say, if you think the internet is broken, don’t expect us to fix it.
The cases weigh social media platforms’ liability for content posted by third parties. In Gonzalez v. Google, heard Tuesday, the family of a 23-year-old American woman killed in a 2015 Islamic State (ISIS) attack in Paris sought to hold Google liable for violating the federal Anti-Terrorism Act’s ban on aiding and abetting terrorism. Family attorney Eric Schnapper argued that YouTube (owned by Google) recommended ISIS videos to users through its algorithms, thereby aiding terror recruitment.
A lower court sided with Google in a ruling that the 9th U.S. Circuit Court of Appeals affirmed.
At issue in Gonzalez is the reach of Section 230 of the Communications Decency Act. The 1996 law allowed the internet to flourish as a free speech forum by immunizing providers of interactive computer services from liability for content posted by third parties. Without that protection, platforms such as Facebook, Google, and YouTube say the internet as we know it would disappear. Providers would have to police content strictly to stave off lawsuits and financial ruin. Some might even shut down. That’s what Google and friend-of-the-court supporters argued in legal briefs and in court Tuesday.
“These are not, like, the nine greatest experts on the internet,” Justice Elena Kagan observed, eliciting laughter from the watching gallery. Conservative Justices Neil Gorsuch, Clarence Thomas, and Brett Kavanaugh shared her skepticism about the court’s ability to fix what ails the web. “Are we really the right body to draw back from what had been the text and consistent understanding in courts of appeals?” asked Kavanaugh.
Factual circumstances in a second case, heard Wednesday, are no less tragic. Nawras Alassaf, a Jordanian citizen, was among 39 people killed in the January 2017 ISIS attack at an Istanbul nightclub. The family is seeking to hold Twitter liable for Alassaf’s death under the Anti-Terrorism Act. While some justices seemed more sympathetic to the family’s challenge, nearly three hours of discussion about layers of hypothetical situations showed the complexity of their task.
Unlike Gonzalez, which focuses on algorithm-generated recommendations of ISIS content, Twitter v. Taamneh goes straight to the heart of the concern in both cases, arguing Twitter aided and abetted international terrorism by failing to keep ISIS content off the platform. Twitter attorney Seth Waxman argued the social media giant regularly removed ISIS-related content and could only be held liable under the Anti-Terrorism Act if it provided substantial assistance for a specific act of terrorism.
The Biden administration supported the platforms but on a narrower basis, with Deputy Solicitor General Edwin Kneedler arguing that a defendant could be held liable under the Anti-Terrorism Act even if it did not know about a specific act.
A California district court dismissed the case in 2018, agreeing with Twitter that the family couldn’t prove the platform knowingly aided and abetted the Istanbul attack. In 2021, the family appealed the decision to the 9th Circuit, which also sided with Twitter.
Some justices seemed troubled by the prospect of letting Twitter completely off the hook. Kagan pushed back against Kneedler’s assertion that a platform that provided an account to a known terrorist differed from a bank that provided an account to one.
“It seems to be true that various kinds of social media platforms also provide very important services to terrorists,” said Kagan. “And if you know that you’re providing a very important service to terrorists, why aren’t you providing substantial assistance and just doing it knowingly?”
Thus far, Congress has failed to amend Section 230 to address criticisms from both political parties. Democrats allege that the platforms amplify racist and violent content, while Republicans raise concerns that content moderators are censoring conservative views.
If this week’s arguments are any indication, Congress may be the only government body able to contain the worst excesses of the internet. Court rulings are blunt instruments for dealing with complex matters, and yet congressional inaction has left proponents of change nowhere else to go.
The Supreme Court will likely have more chances to join the fray. Critical of social media platforms’ censoring of conservative views, Texas and Florida have enacted laws targeting content moderation. Last month, the justices asked the Justice Department to weigh in on the issues presented by the states’ laws, both currently blocked by courts.
The court is expected to issue rulings in the Section 230 cases no later than July.