A new sheriff in town | WORLD

A federal court serves notice to TikTok that unlimited immunity is over


Tawainna Anderson wipes a tear from her eye as she holds a photo of her daughter Nylah at a meeting in Washington to urge Congress to pass legislation to keep kids safe online. Associated Press / Photo by Eric Kayne / ParentsTogether Action


In American capitalism, we rightly give private businesses extensive leeway to seek profit—but not unlimited leeway. If your company cuts corners on a faulty toaster that sparks a house fire, I can sue you for the defective product. If a railroad fails to maintain its network and a deadly accident results, it could find itself on the hook for tens or possibly hundreds of millions in damages. Even First Amendment rights are not unlimited: A newspaper that publishes falsehoods with malicious intent that destroy someone’s reputation can be held liable for libel. Only one industry enjoys immunity from these commonsense curbs on profit—the internet. At least, until now.

In a potentially landmark decision last week, the 3rd U.S. Circuit Court of Appeals ruled in Anderson v. TikTok that it was time for a dramatic rethink of the so-called Section 230 immunity that has helped pad the massive profit margins of the tech industry. The facts of the case are sickening: A 10-year-old girl, Nylah Anderson, encountered a “blackout challenge” video on her “recommended for you” TikTok feed, which dared children to try asphyxiating themselves until they blacked out. Following the platform’s suggestion, Nylah unintentionally hanged herself. Her distraught mother, Tawainna Anderson, tried to hold TikTok liable, but a U.S. District Court rejected her suit, following a quarter-century of precedent that shielded online platforms from liability for any content they hosted, including grotesque libels, incitements to violence, and child pornography.

The 3rd Circuit panel unanimously disagreed. Drawing in part on recent Supreme Court writings from Justice Clarence Thomas, the judges concluded that by aggregating, curating, and personally recommending certain content to users, platforms were not simply hosting “third-party speech” but engaging in their own speech, just like the editor of an anthology. Two of the three judges conceded that TikTok might not be liable merely for allowing self-asphyxiation videos to circulate on its platform; the crucial problem, in their view, was the platform’s recommendation of those videos to users. Given how much the business model of such companies depends on the active curation of recommended content, this ruling alone has the potential to transform the industry.

But in a lengthy and eloquent partial dissent, U.S. Circuit Judge Paul B. Matey went much further. Opening with a quotation from St. Augustine’s Confessions, he denounced TikTok’s “casual indifference to the death of a ten-year-old girl” and called its reading of Section 230 “a position that has become popular among a host of purveyors of pornography, self-mutilation, and exploitation.” From his standpoint, even if Nylah had searched for, not merely stumbled across, the “blackout challenge” videos, the platform should face liability, given that court documents showed “TikTok knew that: 1) ‘the deadly Blackout Challenge was spreading through its app,’ 2) ‘its algorithm was specifically feeding the Blackout Challenge to children,’ and 3) several children had died while attempting the Blackout Challenge after viewing videos of the Challenge on their For You Pages.”


Judge Matey pointed out that in every other communications medium in American history, legislatures and courts have distinguished between unintentional versus knowing distribution of dangerous or criminal content. While it might not make sense to penalize a telegraph operator for sending a coded message from a Mafia boss to a hit man, the situation would be different if the operator knew the meaning of the message and didn’t bother to stop it or report it. Applying conservative principles of originalism and textualism to a close reading of Section 230, Judge Matey demonstrated that Congress had never intended to do more than apply these commonly accepted legal distinctions to the new medium of the internet.

Indeed, the supreme irony of Section 230 is that it is one of the only surviving sections of a 1996 law, the Communications Decency Act, intended precisely to protect children from dangerous content online. In a questionable ruling set to be reconsidered this winter, the Supreme Court struck down most of the law as potentially too restrictive of adult speech but left in place Section 230, designed to encourage platforms to self-police and take down harmful or obscene content. In his opinion, Judge Matey explained how the clause had subsequently been misread to produce precisely the opposite effect, leaving children awash in a media ecosystem of hard-core pornography and self-harm videos.

Since its advent three decades ago, the internet has offered untold benefits to billions of users, providing access to new sources of knowledge and new forms of productivity. But it has also done untold harm, especially to the most vulnerable among us, through a lack of commonsense regulation, as bandits, murderers, and pimps thrive in a Wild West beyond the reach of the law. The 3rd Circuit has served notice that there might be a new sheriff in town, and we can only hope that the Supreme Court follows suit and honors the original intent of Congress for Section 230.


Brad Littlejohn

Brad (Ph.D., University of Edinburgh) is a fellow in the Evangelicals and Civic Life program at the Ethics and Public Policy Center. He founded and served for 10 years as president of The Davenant Institute and currently serves as a professor of Christian history at Davenant Hall and an adjunct professor of government at Regent University. He has published and lectured extensively in the fields of Reformation history, Christian ethics, and political theology. You can find more of his writing at Substack. He lives in Northern Virginia with his wife, Rachel, and four children.


