Anything goes
Section 230 helped make the worst parts of the internet possible. Would reversing it fix the problem?
Illustration by Taylor Callery

On a Sunday afternoon, 13 parents from across the country logged on to Zoom, gearing up to condense into two minutes stories no parent wants to tell.
Each had lost a child, some as young as 14, to fentanyl poisoning. The Zoom call, held in April 2021, included three officials from Snap Inc., the parent company of the social media platform Snapchat. Each parent’s story included a drug dealer who used Snapchat to sell fake pills laced with fentanyl. Snap officials joined the call ahead of a rally the parents planned to hold outside the company’s headquarters in Santa Monica, Calif., to draw attention to what was happening on the platform.
Going into the meeting, three of the parents told me they were cautiously optimistic. They hoped Snapchat officials would hear their stories and initiate changes to combat the problem—or at least acknowledge it.
But after the parents wrapped up their stories, some said Jennifer Stout, Snap’s vice president of global public policy, made it clear she believed the families had another motive: suing the company. Well, they couldn’t, thanks to Section 230, she reportedly informed them. Because the drug dealers had posted the content and not Snap, the company couldn’t be held liable, Stout told the parents, according to the three attendees I spoke with.
Section 230, part of the 1996 Communications Decency Act, has long shielded online platforms from civil liability for user-generated content. Cast in a positive light, 26 of the statute’s roughly 1,000 words are credited with creating the internet and allowing it to flourish: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Congress enacted Section 230 before anyone could fathom the internet’s enormity—the tech industry, the social media boom, artificial intelligence, and countless innovations, opportunities, and dangers. For nearly three decades, the statute has withstood a surprising number of legal and legislative challenges, with courts paving the way for a broad interpretation of its protection for companies. But pressure is mounting to reform or repeal it amid concerns over the way it exacerbates online harms. As the statute’s staunchest defenders gear up for another fight, even they admit there’s no easy solution.

Jaime Puerta holds a poster of his son Daniel, a victim of fentanyl poisoning, during a news conference in Los Angeles. Associated Press / Photo by Damian Dovarganes
THE PARENTS I SPOKE WITH who attended the Zoom call with Snap officials said hearing Section 230 invoked moments after they told their stories felt like a gut punch.
One parent, Jaime Puerta, left the call in tears. Another, Amy Neville, whose son Alexander died of fentanyl poisoning in 2020 at age 14, said she felt angry and frustrated: “The thought of suing wasn’t even on my mind.”
Now, it is. Neville became the lead plaintiff in a 2022 civil lawsuit against Snap that now includes Puerta and 62 other parents. The suit argues that the platform’s features, such as disappearing messages and its data, geolocation, and “quick add” functions, appeal to drug dealers and make illegal activity difficult to track. Some of the parents claim that even after they reported the problem to Snapchat, the platform took months to remove drug dealers’ accounts. In some cases, those same dealers found loopholes to create new accounts, leaving a trail of more overdose deaths, according to the lawsuit.
In an emailed statement, Snap denied the suing parents’ characterization of the April 2021 Zoom call: “This would have been completely contrary to the purpose of the meeting, which was to express our deep condolences for their unimaginable loss, learn from their experiences, and work together to prevent future tragedies.”
Neville sees the suit primarily as a way to spread awareness. She’s since connected with hundreds of parents who have lost children to fentanyl poisoning. In 90% of those cases, she says, kids obtained the drug over Snapchat.
But Matthew Bergman, the attorney representing the families, told me the case also presents another opportunity to chip away at Section 230: “The way the statute has been interpreted has given rise to a level of immunity that no other company in America has.” Bergman, who founded the Social Media Victims Law Center in 2021, believes the tide is turning. “That’s been subject to some judicial scrutiny in recent years—and it’s going to be subject to a lot more,” he said.
In one 2024 case, Anderson v. TikTok Inc., the 3rd U.S. Circuit Court of Appeals ruled Section 230 did not bar claims that the video-sharing platform TikTok’s recommendation algorithm played a part in a 10-year-old user’s accidental death. The company opted not to take its fight to the U.S. Supreme Court, letting the lower court ruling stand.
That same year, the Supreme Court declined to hear a separate case examining the scope of Section 230 that also involved Snapchat. Justices Clarence Thomas and Neil Gorsuch dissented from the court’s decision not to hear the case. They argued the court should review whether social media platforms should be held accountable for their own misconduct. “Make no mistake about it—there is danger in delay,” Thomas wrote in the dissent. “Social media platforms have increasingly used Section 230 as a get-out-of-jail free card.”
One year prior, the Supreme Court considered the statute for the first time in a case involving claims that Google-owned YouTube aided and abetted the terrorist group ISIS by recommending videos posted by the group. In that case, the justices sidestepped a ruling that could have limited the scope of Section 230. During oral arguments, Justice Elena Kagan suggested the task of narrowing the statute’s legal protections is best left to Congress, not the nine-member court: “These are not, like, the nine greatest experts on the internet.”

Amy Neville speaks at a rally outside Snap Inc. headquarters in Santa Monica, Calif. Ringo Chiu / AFP via Getty Images
CONGRESS HAS AMENDED Section 230 just once, in 2018. That change stripped internet sites of the statute’s protection for material that violates federal and state sex trafficking laws. Since then, lawmakers have increasingly sought to further limit the statute’s scope, with little success.
In April, U.S. Sen. Lindsey Graham, R-S.C., signaled he and Sen. Dick Durbin, D-Ill., planned to introduce a bipartisan measure to sunset Section 230. Graham and Durbin have introduced similar legislation for several years, along with separate measures to amend Section 230, to combat online child sexual exploitation. Graham’s press secretary, Taylor Reidy, told me via email there’s “nothing to add” on the status of the proposed bill.
In 2024, a separate bill, the Kids Online Safety Act, overwhelmingly cleared the Senate but failed in the House due to concerns over free speech and weakening Section 230.
When it comes to changing Section 230, “there’s a lot of heat but not much light,” said Corbin Barthold, internet policy counsel and director of appellate litigation at TechFreedom, a technology think tank. Barthold said the debate has pivoted in recent years from online censorship and disinformation concerns to protecting kids.
In April, that shift was evident when the National Center on Sexual Exploitation (NCOSE) released its annual “Dirty Dozen” list “with a twist.” The list typically highlights tech companies allegedly facilitating, enabling, or profiting from sexual abuse and exploitation. But this year’s list targeted Section 230. NCOSE featured 12 survivor stories it claimed portray the ways Section 230 has protected tech companies at the expense of abuse victims. The stories include examples of sex trafficking, sextortion, child pornography, and rape tied to social media platforms, dating apps, and other websites.
“It’s been increasingly confirmed in the last few years that the greatest enabler is Section 230 because it provides this massive liability shield,” said Christen Price, legal counsel for NCOSE.
Critics of Section 230 argue that was never the statute’s intended purpose.

With a photo of her son Devin, Bridgette Norring testifies during a February Senate Judiciary Committee hearing on “The Poisoning of America.” Tom Williams / CQ-Roll Call, Inc via Getty Images
FOR MORE THAN a half century, the Supreme Court affirmed that the First Amendment provided limited immunity to bookstores, newsstands, and other distributors of third-party content when legal claims arose from material produced by others. Courts only held companies liable when they knew or should have known the content was illegal and failed to take action. In the late 1950s, one pivotal case involved a Los Angeles bookstore owner who faced criminal charges for selling an erotic book to an undercover cop. The Supreme Court eventually ruled that penalizing the bookseller was a violation of his First Amendment rights, since he could not be expected to know what was in every book he sold.
Then came the dial-up internet. Before Section 230, if online services took even modest measures to moderate user content, they could be held liable for all posts on their bulletin boards. In 1995, Prodigy, an early online service provider, learned this the hard way. Because the company exercised editorial oversight over its bulletin boards, the New York Supreme Court determined it acted as a publisher, not just a distributor, of content. Therefore the company could be held liable for an anonymous user’s defamatory claims about the securities investment banking firm Stratton Oakmont.
The ruling exposed a glitch. Companies like Prodigy faced a choice: take a “hands off” approach to content moderation and preserve their protection as distributors, or moderate content and be treated as publishers, liable for the vast amounts of material their users produced.
That caught the attention of Reps. Chris Cox and Ron Wyden as Congress worked to update the 1934 Communications Act. Cox and Wyden proposed Section 230 as a way to encourage online service providers to be “good Samaritans,” blocking and screening offensive material. That is, Section 230 gave online service providers the ability to act like publishers when they screened harmful content but still be protected like distributors from the harm caused by any content they did not remove. That allowed these companies to set their own rules to govern their communities and to determine for themselves what content to block—or not.
The logic behind the statute was that the First Amendment “did not adequately protect large online platforms that processed vast amounts of online content,” wrote Jeff Kosseff in his 2019 book, The Twenty-Six Words That Created the Internet.
As Cox and Wyden sought support for Section 230, Sen. James Exon, a Democrat from Nebraska, simultaneously stoked fear about minors accessing internet porn. In the halls of Congress, Exon circulated a binder known as the “Blue Book.” It had a warning label on the cover and included pornographic pictures Exon claimed were easily accessible online, even by children. Exon’s anti-indecency bill sought to address that problem by imposing criminal penalties on internet users and service providers distributing obscene and indecent material that could be viewed by minors.
Exon’s measure garnered considerably more attention than Section 230, according to Kosseff. But the year after the Exon bill passed as part of the Communications Decency Act, the Supreme Court struck it down. In the 1997 case Reno v. ACLU, the court ruled the anti-indecency provisions violated the First Amendment’s guarantee of free speech.
Section 230 remained intact—and Exon’s blue binder proved only a small glimpse into the cesspool of dangers that would soon proliferate on the internet. Lawmakers believed that under Section 230, companies would naturally moderate content on their own. That’s obviously not what happened, and no one could have imagined the implications.
“The worst fears of lawmakers who wanted those protections then have been eclipsed—the internet is much worse than they ever imagined,” said Mary Graw Leary, a law professor at the Catholic University of America. “If you had said, ‘We’ll have Pornhub, we’ll have OnlyFans … we’ll knowingly have child pornography and … no attempts to age verify,’ Congress would have said that was crazy.”
The result: Social media platforms are only as safe as the public pressures them to be. Rather than get rid of Section 230, TechFreedom’s Barthold described his preferred outcome: “That legislators remain perpetually angry and groups and the public are always making noise, such that large social media platforms … realize they’re being closely scrutinized and it’s very important they’re seen acting responsibly.”
Some critics of Section 230 want to add a “bad Samaritan carve-out” to ensure that an internet company knowingly promoting or facilitating criminal content or behavior can’t hide behind Section 230 immunity. “Fundamentally, the world we’re seeking is one where these tech companies and platforms are treated like every other industry ... accountable for actions they engage in that cause harm, and actions that are criminal,” Leary said.
EVEN THE STATUTE’S fiercest supporters acknowledge the internet is inundated with harmful content. And it’s only becoming harder to moderate. But they argue Section 230 is the wrong target—and the trade-off would stifle free speech online and dramatically alter the internet.
“Free speech is a mess, but it’s way better than the alternative,” Dave Willner, a trust and safety expert who has worked with Facebook, Airbnb, and OpenAI, said on a recent episode of Otherwise Objectionable, a podcast aimed at defending Section 230.
The alternative, according to Ari Cohn, legal counsel for tech policy at the Foundation for Individual Rights and Expression, would be chaos. Without Section 230, platforms would face increased liability risks for any attempts at moderation, leading to costly and prolonged legal battles. Big Tech companies could absorb the costs. But smaller players in the internet pool, such as blogs, nonprofits, and newer social media platforms, would more likely bear the brunt of the consequences and lose their voice in society, he said.
And given the scale of social media, “it’s actually impossible to do content moderation perfectly. It’s almost impossible to do it even approaching perfection,” Cohn added.
Despite internet advancements, Cohn argues the underlying principle of Section 230 remains: “People who are responsible for creating harmful content are the ones who should be held liable for any harmful effects of the content.” Section 230 includes an exemption for federal criminal law, so if there’s evidence platforms are knowingly allowing criminal child sexual abuse material, for example, “that really just means you need to get on federal prosecutors and police to do their jobs,” Cohn told me.
That’s hard for the parents suing Snapchat to swallow. On the Zoom call with Snap officials, Bridgette Norring of Hastings, Minn., pushed back when she says Stout suggested parents should have monitored their kids’ online activity better. “My son was 19. How do you monitor a 19-year-old?” she asked.
At the time of the Zoom meeting, Norring says, one of her late son’s alleged drug dealers was still operating on the app. At one point, she flagged several alleged dealers using Snapchat’s in-app reporting mechanism. The accounts disappeared, but then the dealers came back, using different accounts. Norring later learned that reporting the alleged dealers on the app hampered the police investigation into her son’s death. When Snapchat erased the dealers’ accounts and data, she says, it became harder for police to track them.
Snap claimed in its statement to me that it has taken measures to address problems on the app that make it unsafe for vulnerable teens. On July 17, Snap announced support for a new congressional bill named after Norring’s son, Devin, and another fentanyl poisoning victim. It would require social media companies to alert federal law enforcement when they detect illicit drug activity on their platforms. Norring is hopeful that the legislation and the lawsuit she joined against Snap force change: “They have proven for far too long they cannot regulate themselves.”