Tech legislation 2.0
Congress should end Big Tech’s special privileges on product liability
In recent years, many conservatives have awakened to the threat of Big Tech censorship, as politically incorrect viewpoints are mysteriously flagged, deleted, or suppressed by hostile algorithms. What many have failed to realize, however, is the even greater threat posed by Big Tech’s non-censorship. In 2021, a young man sued Twitter (now X) for allowing sexually explicit images of him, taken when he was just 13 years old, to circulate publicly on its platform. Incredibly, when he first reported the images to Twitter, the company responded, “We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” Even more incredibly, when he filed suit in federal court, the case was dismissed.
His story is one of hundreds of similar stories, involving leading social media platforms like Facebook, Instagram, Reddit, and Snapchat, and even sleepy websites like Craigslist. In each case, the website was knowingly complicit in one of the worst crimes our sexually permissive society still recognizes—sharing child pornography or facilitating child sex trafficking. In each case, the website profited from the behavior and refused to change its ways. In each case, lawsuits against the company were dismissed. All because of Section 230.
Each year, the National Center on Sexual Exploitation publishes a “Dirty Dozen” list of the worst corporate offenders for facilitating and profiting from sexual abuse. But with more and more companies escaping basic legal accountability, this year their list instead highlighted twelve survivors denied justice because of this distorted and obsolete provision of the 1996 Communications Decency Act. Originally part of a law specifically designed to protect kids online in the early days of the internet, Section 230 found itself orphaned when the Supreme Court struck down the rest of the CDA as an infringement of “free speech.” Like many orphans deprived of a good home, Section 230 turned to a life of crime. During the past three decades, aided and abetted by tech lobbyists and well-heeled law firms, it has evolved into almost a blank check of sovereign immunity for any and every tech company.
Back in 1996, Congress’s goal was to create an internet where platforms were strongly incentivized to remove inappropriate or illicit content that users posted on them but would not face crippling legal liabilities if something slipped through their filters. Once the rest of the CDA was struck down, however, the incentives for good behavior were gone, and companies decided to prioritize profits over people, investing only the bare minimum in content moderation.
Instead, they invested in content amplification, through sophisticated new algorithms that could increase user “engagement” by matching people up with “more content you’ll like.” This included matching 37-year-old pedophiles up with young teens ripe for exploitation. Only by the grossest stretch of legal reasoning could companies claim immunity for actions taken by their own algorithms, but until recently, such gross stretches have been the norm.
Recently, that has at last begun to change, with courts beginning to side occasionally with a growing throng of victims. Congress, too, has taken action, passing the TAKE IT DOWN Act (unanimously in the Senate, 409-2 in the House), which will require online platforms to remove non-consensual sexually explicit images within 48 hours of notification. It’s sad that it took this long for such a commonsense law to pass—for many years, platforms had to respond right away if Disney told them that one of its copyrighted cartoon characters appeared on their site, but they could ignore a teen girl notifying them of videos of her rape. But it’s certainly a step in the right direction, and crucially, the Act covers AI “deepfakes,” which have been used to harass many women with simulated sexual images of them.
The advent of such AI threats online underscores just how quickly legislators are going to have to move to play catch-up with emerging technology. Last week, the Wall Street Journal reported that Meta’s own chatbots would engage minors in sexually explicit conversations, while the popular platform Character.AI faces a lawsuit for a chatbot that encouraged a 14-year-old boy to commit suicide. It should be obvious that Section 230 immunity—meant to protect companies from liability for content uploaded by other humans out of their control—should not apply to companies’ own chatbots, but that hasn’t stopped tech companies from trying to make the argument that it does.
Ultimately, the best solution is to repeal Section 230 and replace it with legislation that treats Big Tech the same as other industries when it comes to product liability. Although the companies will complain that this is heavy-handed government regulation, the reality is that it is the most free-market approach: by allowing private citizens to sue in court for clear harms caused by these products and platforms, it will unleash market incentives for companies to invest in safer design and self-regulation. With the hearts and minds of our children on the line, the stakes couldn’t be higher.
