
Brad Littlejohn: Accountability for social media


A panel of judges rules that TikTok may be held liable for the death of a child


Tawainna Anderson (left) holds a photo of her daughter Nylah, Oct. 15, 2022. Associated Press Images for ParentsTogether Action/Photo by Eric Kayne

LINDSAY MAST, HOST: Today is Wednesday, September 11th. Good morning! This is The World and Everything in It from listener-supported WORLD Radio. I’m Lindsay Mast.

NICK EICHER, HOST: And I’m Nick Eicher. Up next, holding social media companies accountable for content harmful to children.

Here’s WORLD Opinions commentator Brad Littlejohn.

BRAD LITTLEJOHN: In American capitalism, we rightly give private businesses extensive leeway to seek profit—but not unlimited leeway. If your company cuts corners and sells a faulty toaster that sparks a house fire, I can sue you over the defective product. Even First Amendment rights are not unlimited: A newspaper that maliciously publishes falsehoods that destroy someone’s reputation can be held liable for libel. Only one industry enjoys immunity from these commonsense curbs on profit—the internet. At least, until now.

In a potentially landmark decision, the 3rd U.S. Circuit Court of Appeals recently issued a ruling in Anderson v. TikTok. The facts of the case are sickening: A 10-year-old girl, Nylah Anderson, tried a dangerous “challenge” shown in videos the platform recommended she view. After mimicking what she saw, she died. Her distraught mother, Tawainna Anderson, tried to hold TikTok liable, but a U.S. District Court rejected her suit. In so doing, the court followed a quarter-century of precedent shielding online platforms from liability for any content they host—including the worst offenses of libel, violence, and pornography.

The 3rd Circuit panel unanimously rejected the lower court’s ruling. Drawing in part on recent Supreme Court writings from Justice Clarence Thomas, the judges concluded that by recommending certain content to users, platforms were not simply hosting “third-party speech” but engaging in their own speech—just as the editor of an anthology does. For two of the three judges, however, the crucial problem was specifically the platform’s recommendation of those videos to users.

A third judge, Paul B. Matey, wrote a separate opinion saying the court should have gone further. Opening with a rhetorical flourish that included a quotation from St. Augustine’s Confessions, he denounced TikTok’s “casual indifference to the death of a ten-year-old girl.” He then criticized the platform’s sweeping reading of Section 230 of the Communications Decency Act. From his standpoint, even if Nylah had searched for, rather than merely stumbled across, the videos in question, the platform should still face liability, since it knew that children had died after viewing the content yet let the videos continue to circulate.

Judge Matey also pointed out that for every other communications medium in American history, legislatures and courts have distinguished between unintentional and knowing distribution of dangerous or criminal content. Indeed, the supreme irony of Section 230 is that it is one of the only surviving sections of the Communications Decency Act, a 1996 law designed to protect children from dangerous content online. In a questionable 1997 ruling, the Supreme Court struck down most of the law as potentially too restrictive of adult speech. But it left in place Section 230, which was meant to encourage platforms to self-police and to take down harmful or obscene content. In his opinion, Judge Matey explained how the clause had subsequently been misread, leaving children inundated by a media ecosystem of hard-core pornography and self-harm videos.

Since its advent three decades ago, the internet has offered untold benefits to billions of users, providing access to new sources of knowledge and new forms of productivity. But a lack of commonsense regulation has also let it do untold harm, leaving behind a digital Wild West beyond the reach of the law. The 3rd Circuit has served notice that there may be a new sheriff in town, and we can only hope that the Supreme Court follows suit and honors Congress’s original intent for Section 230.

I’m Brad Littlejohn.


WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.
