Of course, TikTok knew

Leaked memos reveal that the platform’s executives were well aware of how dangerous their product is for children


California Attorney General Rob Bonta at a news conference on Oct. 8 in San Francisco, announcing a bipartisan coalition of state attorneys general filing suit against TikTok for exploiting children. Associated Press/Photo by Minh Connors


Earlier this month, another bombshell burst over the heads of the beleaguered tech industry when 14 state attorneys general announced coordinated lawsuits against TikTok for knowingly and willfully exploiting young users. Recognizing that such users were the “golden audience” for the platform because of their inability to resist its addictive pull, TikTok executives designed their algorithms to hook teenagers in as little as 35 minutes while making only halfhearted efforts to suppress and remove dangerous and pedophilic content. If that wasn’t bad enough, sealed court documents in Kentucky’s suit were accidentally left unredacted, giving reporters access to TikTok’s internal memos. The documents are utterly damning.

In one memo, TikTok’s research showed that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” But since young users were most likely to use the app compulsively, they were particularly targeted. “As expected, across most engagement metrics, the younger the user, the better the performance,” said one document.

Under public pressure to behave more responsibly, TikTok introduced a “screen time nudge” that would supposedly help users disengage, but it first determined through internal tests that the tool would have virtually no effect, reducing daily usage from 108.5 minutes to just 107 minutes. Not to worry, said a project manager: “Our goal is not to reduce the time spent”—just to reduce negative PR.

Other documents show the company was well aware that its beauty filters and algorithms could encourage eating disorders, that its content filters had only a 64 percent success rate in blocking “normalization of pedophilia” posts, and that its TikTok Live feature was being used as an online strip club for paid teen performers. The memos also reveal that TikTok knew that plenty of users on its standard platform were under age 13, in violation of federal law, but took little action to remove those underage accounts.


In some ways, these revelations are not shocking at all. It’s not as if most of us don’t know that such apps are designed to be slot machines for minors. Nicholas Carr published his findings on what the internet does to our brains in 2010. Alexis Madrigal compared social media design to the techniques of the video gambling industry in 2013. And the viral documentary The Social Dilemma featured myriad interviews with industry whistleblowers in 2020. It would be shocking if TikTok executives didn’t know exactly how their products worked on the minds of children. What’s surprising is simply the brazenness of their prioritization of profit over health.

In this, the recent battle against Big Tech feels like a replay—at 10 times the speed—of last century’s battle against Big Tobacco. That battle began in 1964 with the surgeon general’s health warning, was followed by the 1965 Federal Cigarette Labeling and Advertising Act, and culminated more than three decades later in a massive lawsuit settlement with 52 state and territory attorneys general. There, too, it emerged that the industry was well aware of the harms of its product long before the surgeon general was, and that it deliberately marketed to children to increase the chances of addiction. With the surgeon general issuing a warning about the harms of social media earlier this year and many states rushing to get smartphones out of schools, we may well wonder whether firms like TikTok are headed for a day of reckoning.

Of course, when it comes to such product liability cases, we must strike an appropriate moral balance. On the one hand, it is very easy for a consumeristic society to intermittently look for scapegoats after guiltily bingeing on one product or another. While the tobacco industry certainly engaged in disinformation, plenty of people knew at some level that cigarettes were bad for them and kept smoking anyway. Similarly, in this case, I doubt many parents are surprised to read that TikTok is harmful to their children, yet the attorneys general aren’t prosecuting them for child abuse.

On the other hand, there’s a reason why we treat addictive products differently, especially for children. While consumers ought to take moral responsibility for the products they buy and platforms they use, it is simply not a fair fight when corporations exploit chemical and psychological dependencies to bypass our decision-making—especially if they are singling out those whose brains have not fully developed such executive functions. And while parents have the first duty to protect their children from such products, we have long recognized that Mom and Dad need good laws to help them. Parents shouldn’t let their kids drink alcohol or visit the casino, to be sure, but we also demand that such places of business refuse service to minors.

The recent TikTok revelations are likely to give fresh momentum to legal efforts to require app stores to verify users’ ages, and Christian parents should be the first to cheer such efforts.


Brad Littlejohn

Brad (Ph.D., University of Edinburgh) is a fellow in the Evangelicals and Civic Life program at the Ethics and Public Policy Center. He founded and served for 10 years as president of The Davenant Institute and currently serves as a professor of Christian history at Davenant Hall and an adjunct professor of government at Regent University. He has published and lectured extensively in the fields of Reformation history, Christian ethics, and political theology. You can find more of his writing at Substack. He lives in Northern Virginia with his wife, Rachel, and four children.

