
Harboring the evil amongst us

Instagram gets away with dragging its feet in dealing with child pornography


Instagram’s logo appears on a phone. Associated Press/Photo by Amr Alfiky


In a devastating recent exposé in the Wall Street Journal, investigators revealed that Instagram, one of the world’s most popular social media platforms, has been pervasively and persistently enabling the distribution of child pornography. Explicit hashtags such as #preteensex made it effortless for predators to search for users offering illicit content, and if a user showed even fleeting interest in pedophilic material, the platform’s algorithms helpfully offered to connect the user with other accounts promoting such content. Many such accounts used their Instagram profiles to post “menus” of explicit content for sale or available on other websites.

One might be tempted to chalk all this up merely to poor oversight on Instagram’s part. After all, how can even a huge social media platform be expected to police 1.3 billion users, each of whom posts an average of one image per day? And yet the WSJ investigation revealed that Instagram knew what was going on, and still permitted it.

Stanford Internet Observatory researchers captured a shocking screenshot of a pop-up warning that Instagram displayed, advising that “These results may contain images of child sexual abuse,” and noting that such images caused “extreme harm to children”—followed by two options: “Get Resources” or “See Results Anyway.” When contacted by the WSJ, Instagram “declined to say why it had offered that option.”

Our first reaction to such revelations is shock. Let’s not allow our second reaction to become a shrug of resignation. Millions of Americans boycotted Bud Light because it put pictures of a trans activist on its beer cans. Thus far, few have deleted their Instagram accounts because it regularly gives a platform to sexualized pictures of children.

Optimists might assume that Instagram, now outed, will lose no time in cracking down on the bad apples exploiting its platform. Perhaps we should believe the company’s claims that it is doing its best and just can’t keep up. But distributing sexual content produced by minors is not a bug in the software, but a feature. Many reports have documented the ways that Instagram’s most active users, teenage girls, are lured and pressured by the platform into posting sexualized selfies, with disastrous consequences for their mental health.


Meanwhile, over-18 “influencers” routinely use the platform as a primary gateway to build a following for the pornography they generate on sites like OnlyFans. With such a teeming jungle of sexual content on its servers, it’s not surprising that Instagram should throw up its hands and say, “How can you expect us to separate the merely suggestive from the illegal?”

This crisis is a predictable result of our society’s embrace of a minimalist “harm principle”—the idea that consenting individuals should be free to do whatever they want so long as no one is directly and physically harmed. We somewhat arbitrarily declare that those under 18 are incapable of consent, and so deserve some special protection, but anyone else is fair game. Unfortunately, it doesn’t take long to discover that once you adopt such a posture, even the few remaining red lines become almost unenforceable.

The crisis on our social media platforms is also a result of judicial malpractice. When the internet was first taking off in the 1990s, Section 230 of the Communications Decency Act provided that websites would not bear publisher liability for content their users posted: in other words, if I defamed you on my Facebook feed, Facebook could not be sued the way the New York Times could be for publishing libel. Subsequently, however, courts interpreted and applied Section 230 to grant online platforms immunity from other kinds of liability never envisioned or described in the statute.

In a scathing dissection of this trend, Justice Clarence Thomas observed that these decisions have “discarded the longstanding distinction between ‘publisher’ and ‘distributor’ liability.” A magazine stand that knowingly sells child pornography does not have the same legal liability as the publisher, but it is certainly not off the hook. In every other domain besides the internet, courts have also recognized “product liability,” punishing companies for designing and selling products that they knew would harm their users. (One smaller social media platform, Omegle, is currently facing a product liability lawsuit in the United States for its role in facilitating child sexual abuse.)

Right now, Instagram drags its feet over blocking and removing child porn because it can get away with it. Policing such content would take a lot of work and cost a lot of money. It might also mean de-sexualizing the site more broadly in a way that dramatically reduces traffic. The Wall Street Journal’s exposé is a good start, but it’s high time to push for real legal accountability when it comes to social media.


Brad Littlejohn

Brad Littlejohn (Ph.D., University of Edinburgh) is a fellow in the Evangelicals and Civic Life program at the Ethics and Public Policy Center. He founded and served for ten years as president of The Davenant Institute, and has taught for several institutions, including Moody Bible Institute–Spokane, Bethlehem College and Seminary, and Patrick Henry College. He is recognized as a leading scholar of the English theologian Richard Hooker and has published and lectured extensively in the fields of Reformation history, Christian ethics, and political theology. He lives in Landrum, S.C., with his wife, Rachel, and four children.


