Forbidden fruit
Apple must avoid the temptation of child pornography
Apple CEO Tim Cook speaks at a conference in Cupertino, Calif., on June 5, 2023. Josh Edelson / AFP via Getty Images

We are currently in the early stage of the annual shareholder meeting season, with both Apple and John Deere putting ballots before shareholders. Christians have generally been unaware of the authority they hold as shareholders to force change in companies rather than just complain about them. Just as citizens have a right to elect (or not) representatives in government and to vote on ballot questions, so shareholders have rights with respect to the companies they own. The left knows that. It has known it for decades. That’s how things got so bad.
But now investors such as David Bahnsen and Christian ministries such as American Family Association (AFA), with the assistance of Alliance Defending Freedom and my own Bowyer Research, are setting the agenda rather than reacting to the left’s agenda.
At the top of the agenda at the Apple meeting is a proposal from shareholder AFA, which holds the company accountable for what certainly appears to be a lax attitude towards child sexual abuse material (aka CSAM).
In messages unearthed during the company’s 2021 legal battle with Epic Games, Apple’s head of Fraud Engineering Algorithms and Risk, Eric Friedman, delivered a damning assessment of child abuse on its platform that any shareholder with a functioning moral compass should find alarming. Friedman summed up Apple’s reality bluntly: “we are the greatest platform for distributing child porn, etc.” In the same documents, Friedman pointed to an upcoming trust and safety presentation at Apple that listed “child predator grooming reports” as both an “active threat” and an “under-resourced challenge.”
It’s bad enough that Apple executives describe the company this way. But executives like Friedman see such threats with a degree of separation—they clock out, go home, and the reality of Apple’s inability to curb online child sexual abuse doesn’t go home with them. The same can’t be said for “Jessica” and “Amy,” pseudonymous plaintiffs in a pending class action lawsuit against Apple. These two women live with the reality of the abuse they experienced as children every day—every time law enforcement informs them that videos of their sexual abuse, literal child pornography, have been uncovered on yet another Apple device. An iCloud account in California. A MacBook in Virginia. And the list grows by the day.
Apple’s choice not to combat CSAM, if the class action lawsuit succeeds, could cost the company north of $1 billion in shareholder money. It’s a choice that gives children less protection than many online platforms provide adults. If Apple wants to prove its sincerity on combating online child abuse, it could start by answering some very basic questions. Twenty-five-year-olds using modern dating apps have sexually explicit pictures blurred by default in the name of user protection. Why won’t Apple do the same for 15-year-olds on iMessage? Shielding children under the age of 14 against explicit content is excellent—so why aren’t similar protective features turned on for children over the age of 14? There isn’t a “levels of maturity” question when it comes to which children deserve to be protected from sexual content on Apple’s platforms. The answer should be all of them. So why isn’t it?
Maybe Apple has good answers to such questions, but if so, it should not oppose a resolution calling on it to share those answers with the rest of us. From a statement Bowyer Research filed with the SEC: “Apple may have an honest and intelligible rationale for not deploying CSAM detection protocols—but shareholders deserve to know what that rationale actually is.”
The proposal regarding child sexual abuse material is not the only encouraging sign. For the first time that I’ve ever seen, every shareholder proposal on the Apple ballot is from conservative groups, with a preponderance focused on protection of children from distortive sexual identity ideologies. Aside from our proposal, the National Legal & Policy Center is urging Apple to issue a risk analysis regarding the company’s ethical use of AI, including protection of children’s privacy. The National Center for Public Policy Research is requesting that the company officially terminate its divisive and non-fiduciary diversity, equity, and inclusion (DEI) efforts. And Inspire Investing is asking Apple to justify its sponsorship of the Human Rights Campaign, with reference to its pressure campaign for companies to pay for sexual transition treatments for minors.
Some might object that Apple is doing the right thing by shareholders because its current approach is profitable. But at what cost? At the cost of causing grotesque harm to children and youth? Let’s remember Jesus’ words about millstones around necks. The political, reputational, and legal backlash against profiting off the sexualization of children is a this-worldly risk that no business can ignore.
