Harboring the evil amongst us
Instagram gets away with dragging its feet in dealing with child pornography
In a devastating recent exposé in the Wall Street Journal, investigators revealed that Instagram, one of the world’s most popular social media platforms, has been pervasively and persistently enabling the distribution of child pornography. Explicit hashtags such as #preteensex made it effortless for predators to search for users offering illicit content, and if a user showed even fleeting interest in pedophilic material, the platform’s algorithms helpfully offered to connect the user with other accounts purveying such content. Many such accounts used their Instagram profiles to post “menus” of explicit content for sale or available on other websites.
One might be tempted to chalk all this up merely to poor oversight on Instagram’s part. After all, how can even a huge social media platform be expected to police 1.3 billion users, each of whom posts an average of one image per day? And yet the WSJ investigation revealed that Instagram knew what was going on, and still permitted it.
Stanford Internet Observatory researchers captured a shocking screenshot of a pop-up warning that Instagram displayed, advising that “These results may contain images of child sexual abuse,” and noting that such images caused “extreme harm to children”—followed by two options: “Get Resources” or “See Results Anyway.” When contacted by the WSJ, Instagram “declined to say why it had offered that option.”
Our first reaction to such revelations is shock. Let’s not allow our second reaction to become a shrug of resignation. Millions of Americans boycotted Bud Light because it put pictures of a trans activist on its beer cans. Thus far, few have deleted their Instagram accounts because it regularly gives a platform to sexualized pictures of children.
Optimists might assume that Instagram, now outed, will lose no time in cracking down on the bad apples exploiting its platform. Perhaps we should believe the company’s claims that it is doing its best and just can’t keep up. Yet distributing sexual content produced by minors is not a bug in the software but a feature. Many reports have documented the ways that Instagram’s most active users, teenage girls, are lured and pressured by the platform into posting sexualized selfies, with disastrous consequences for their mental health.
Meanwhile, over-18 “influencers” routinely use the platform as a primary gateway to build a following for the pornography they generate on sites like OnlyFans. With such a teeming jungle of sexual content on its servers, it’s not surprising that Instagram should throw up its hands and say, How can you expect us to separate the merely suggestive from the illegal?
This crisis is a predictable result of our society’s embrace of a minimalist “harm principle”—the idea that consenting individuals should be free to do whatever they want so long as no one is directly and physically harmed. We somewhat arbitrarily declare that those under 18 are incapable of consent, and so deserve some special protection, but anyone else is fair game. Unfortunately, it doesn’t take long to discover that once you adopt such a posture, even the few remaining red lines become almost unenforceable.
The crisis on our social media platforms is also a result of judicial malpractice. When the internet was first taking off in the 1990s, Section 230 of the Communications Decency Act promised that websites would not have publisher liability for content their users posted on the sites: in other words, if I slandered you on my Facebook feed, Facebook could not be sued the same way the New York Times could be for publishing slander. Subsequently, however, courts interpreted and applied Section 230 to grant online platforms immunity from other kinds of liability never envisioned or described in the statute itself.
In a scathing dissection of this trend, Justice Clarence Thomas observed that these decisions have “discarded the longstanding distinction between ‘publisher’ and ‘distributor’ liability.” A magazine stand that knowingly sells child pornography does not have the same legal liability as the publisher, but it is certainly not off the hook. In virtually every domain other than the internet, courts have also recognized “product liability,” punishing companies for designing and selling products that they knew would harm their users. (One smaller social media platform, Omegle, is currently facing a product liability lawsuit in the United States for its role in facilitating child sexual abuse.)
Right now, Instagram drags its feet over blocking and removing child porn because it can get away with it. Policing such content would take a lot of work and cost a lot of money. It might also mean de-sexualizing the site more broadly in a way that dramatically reduces traffic. The Wall Street Journal’s exposé is a good start, but it’s high time to push for real legal accountability when it comes to social media.