Of course, TikTok knew
Leaked memos reveal that the platform’s executives were well aware of how dangerous their product is for children
Earlier this month, another bombshell burst over the heads of the beleaguered tech industry when 14 state attorneys general announced coordinated lawsuits against TikTok for knowingly and willfully exploiting young users. Recognizing that such users were the “golden audience” for the platform because of their inability to resist its addictive pull, TikTok executives designed their algorithms to hook teenagers in as little as 35 minutes while making only halfhearted efforts to suppress and remove dangerous and pedophilic content. As if that weren’t bad enough, sealed court documents in Kentucky’s suit were accidentally unredacted, giving reporters access to TikTok’s internal memos. The documents are utterly damning.
In one memo, TikTok’s research showed that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” But since young users were most likely to use the app compulsively, they were particularly targeted. “As expected, across most engagement metrics, the younger the user, the better the performance,” said one document.
Under public pressure to behave more responsibly, TikTok introduced a “screen time nudge” that would supposedly help users disengage but first determined through internal tests that it would have virtually no effect—reducing daily usage from 108.5 minutes to 107 minutes. Not to worry, said a project manager: “Our goal is not to reduce the time spent”—just to reduce negative PR.
Other documents show the company was well aware that its beauty filters and algorithms could encourage eating disorders, that its content filters had only a 64 percent success rate in blocking “normalization of pedophilia” posts, and that its TikTok Live feature was being used as an online strip club for paid teen performers. The memos also reveal that TikTok knew that plenty of standard users were under age 13, violating federal law, but took little action to remove such underage accounts.
In some ways, these revelations are not shocking at all. It’s not as if most of us don’t know that such apps are designed to be slot machines for minors. Nicholas Carr published his findings on what the internet does to our brains in 2010. Alexis Madrigal compared social media design to the techniques of the video gambling industry in 2013. And the viral documentary The Social Dilemma featured myriad interviews with industry whistleblowers in 2020. It would be shocking if TikTok executives didn’t know exactly how their products worked on the minds of children. What’s surprising is simply the brazenness of their prioritization of profit over health.
In this, the recent battle against Big Tech feels like a replay—at 10 times the speed—of last century’s battle against Big Tobacco. That began in 1964 with the surgeon general’s health warning, followed by the 1965 Federal Cigarette Labeling and Advertising Act, and culminated more than three decades later in a massive lawsuit settlement with 52 state and territory attorneys general. There, too, knowledge emerged that the industry was well aware of the harms of its product long before the surgeon general was, and it deliberately marketed to children to increase the chances of addiction. With the surgeon general issuing a warning about the harms of social media earlier this year and many states rushing to get smartphones out of schools, we may well wonder whether firms like TikTok are headed for a day of reckoning.
Of course, when it comes to such product liability cases, we must strike an appropriate moral balance. On the one hand, it is very easy for a consumeristic society to intermittently look for scapegoats after guiltily bingeing on some product or another. While the tobacco industry certainly engaged in disinformation, plenty of people knew at some level that cigarettes were bad for them and kept smoking anyway. Similarly, in this case, I doubt many parents are surprised to read that TikTok is harmful to their children, but the attorneys general aren’t prosecuting them for child abuse.
On the other hand, there’s a reason why we treat addictive products differently, especially for children. While consumers ought to take moral responsibility for the products they buy and platforms they use, it is simply not a fair fight when corporations exploit chemical and psychological dependencies to bypass our decision-making—especially if they are singling out those whose brains have not fully developed such executive functions. And while parents have the first duty to protect their children from such products, we have long recognized that Mom and Dad need good laws to help them. Parents shouldn’t let their kids drink alcohol or visit the casino, to be sure, but we also demand that such places of business refuse service to minors.
The recent TikTok revelations are likely to give fresh momentum to legal efforts to require app stores to verify users’ ages, and Christian parents should be the first to cheer such efforts.