A new sheriff in town
A federal court serves notice to TikTok that unlimited immunity is over
In American capitalism, we rightly give private businesses extensive leeway to seek profit—but not unlimited leeway. If your company cuts corners on a faulty toaster that sparks a house fire, I can sue you for the defective product. If a railroad fails to maintain its network and a deadly accident results, it could find itself on the hook for tens or possibly hundreds of millions in damages. Even First Amendment rights are not unlimited: A newspaper that publishes falsehoods with malicious intent that destroy someone’s reputation can be held liable for libel. Only one industry enjoys immunity from these commonsense curbs on profit—the internet. At least, up until now.
In a potentially landmark decision last week, the 3rd U.S. Circuit Court of Appeals ruled in Anderson v. TikTok that it was time for a dramatic rethink of the so-called Section 230 immunity that has helped pad the massive profit margins of the tech industry. The facts of the case are sickening: A 10-year-old girl, Nylah Anderson, encountered a “blackout challenge” video on her “recommended for you” TikTok feed, which dared children to try asphyxiating themselves until they blacked out. Following the platform’s suggestion, Nylah unintentionally hanged herself. Her distraught mother, Tawainna Anderson, tried to hold TikTok liable, but a U.S. District Court rejected her suit, following a quarter-century of precedent that shielded online platforms from liability for any content they hosted—including grotesque libels, incitements to violence, and child pornography.
The 3rd Circuit panel unanimously disagreed. Drawing in part on recent Supreme Court writings from Justice Clarence Thomas, the judges concluded that by aggregating, curating, and personally recommending certain content to users, platforms were not simply hosting “third-party speech” but engaging in their own speech—just like the editor of an anthology. Two of the three judges allowed that TikTok might not be liable merely for letting self-asphyxiation videos circulate on its platform; the crucial problem, they held, was the platform’s active recommendation of those videos to users. Given how much the business model of such companies depends on the active curation of recommended content, this ruling alone has the potential to transform the industry.
But in a lengthy and eloquent partial dissent, U.S. Circuit Judge Paul B. Matey went much further. Opening with a quotation from St. Augustine’s Confessions, he denounced TikTok’s “casual indifference to the death of a ten-year-old girl” and called its reading of Section 230 “a position that has become popular among a host of purveyors of pornography, self-mutilation, and exploitation.” From his standpoint, even if Nylah had searched for, not merely stumbled across, the “blackout challenge” videos, the platform should face liability, given that court documents showed “TikTok knew that: 1) ‘the deadly Blackout Challenge was spreading through its app,’ 2) ‘its algorithm was specifically feeding the Blackout Challenge to children,’ and 3) several children had died while attempting the Blackout Challenge after viewing videos of the Challenge on their For You Pages.”
Judge Matey pointed out that in every other communications medium in American history, legislatures and courts have distinguished between unintentional versus knowing distribution of dangerous or criminal content. While it might not make sense to penalize a telegraph operator for sending a coded message from a Mafia boss to a hit man, the situation would be different if the operator knew the meaning of the message and didn’t bother to stop it or report it. Applying conservative principles of originalism and textualism to a close reading of Section 230, Judge Matey demonstrated that Congress had never intended to do more than apply these commonly accepted legal distinctions to the new medium of the internet.
Indeed, the supreme irony of Section 230 is that it is one of the only surviving sections of a 1996 law, the Communications Decency Act, intended precisely to protect children from dangerous content online. In a questionable ruling set to be reconsidered this winter, the Supreme Court struck down most of the law as potentially too restrictive of adult speech but left in place Section 230, designed to encourage platforms to self-police and take down harmful or obscene content. In his opinion, Judge Matey explained how the clause had subsequently been misread to produce precisely the opposite effect, leaving children inundated in a media ecosystem of hard-core pornography and self-harm videos.
Since its advent three decades ago, the internet has offered untold benefits to billions of users, providing access to new sources of knowledge and new forms of productivity. But it has also done untold harm, especially to the most vulnerable among us, through a lack of commonsense regulation, as bandits, murderers, and pimps thrive in a Wild West beyond the reach of the law. The 3rd Circuit has served notice that there might be a new sheriff in town, and we can only hope that the Supreme Court follows suit and honors the original intent of Congress for Section 230.