
Fake pornography, real victims

TRENDING | As doctored images of women circulate on the internet, laws against deepfakes haven’t kept up


Illustration by Raúl Arias


When Noelle Martin was 18, she googled her name and found photos of herself that had been doctored, making it look like she had participated in a pornographic photo shoot. Martin then discovered the photos had circulated on dozens of pornography websites. About six years later, she received an email from a sender she did not recognize, telling her that someone had made a deepfake pornographic video of her.

A deepfake is a video that has been altered by artificial intelligence technology to make realistic-looking, but phony, footage of a celebrity or other person (for instance, the video of Ukrainian President Volodymyr Zelenskyy telling his troops to surrender). Deepfake pornography is footage of a person performing sexual acts that he or she never actually engaged in.

The technology is new and dangerous to victims and pornography users alike, and some experts say it has the potential to be even more destructive than older forms of pornography. American laws, meanwhile, lag far behind it.

Only a few decades ago, people could find pornography only in smutty magazines and video stores. But with advances in technology, the accessibility of pornography has skyrocketed. “We’re carrying around the largest library of pornography ever created in the history of mankind in our pockets,” said Sam Black, director of life change education at Covenant Eyes.

Today, many children first view pornography between the ages of 8 and 11. People often get trapped into what Black calls a “porn rut” because of early exposure, and this habit becomes incredibly difficult to overcome. “Now we’re going to up the ante with artificial intelligence,” said Black. “As if there wasn’t enough pornography, and enough different genres.”

The term deepfake first emerged in 2017 after a Reddit user began posting AI-doctored pornography. Just six years later, according to research company Sensity AI, “The top four websites dedicated to deepfake pornography received more than 134 million views on videos targeting hundreds of female celebrities worldwide.” The company reports that 96 percent of deepfakes are pornographic.

And with the way tech is advancing, anyone can be the subject of AI pornography. According to NBC News, a deepfake creator on Discord charges $65 for deepfakes of anyone. “There’s a lot of, I think, misunderstanding about this form of abuse,” said Noelle Martin. “That it only happens to celebrities or public figures when a lot of everyday women are being targeted, but we just might not hear about it.”

Often, deepfake videos are made by layering the victim’s face onto a preexisting video. To “train” the AI program, creators feed pictures and videos of the victim into the system so that it can learn to replicate the subject’s likeness and place it on another body. The subject and the actor in the original footage need to look at least somewhat similar to ensure a seamless result.

Many of these tools rely on generative adversarial networks, or GANs, in which a second “discriminator” network learns to spot flaws in the generated frames while the generator learns to correct whatever the discriminator catches. That feedback loop makes deepfake pornography hard to distinguish from the real thing. Though producing a seamless deepfake still requires time and skill (to dub the audio correctly, for example), many AI-doctored clips look convincing.
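For readers curious about the mechanics, the adversarial feedback loop can be sketched in a few lines of code. The toy example below, written in Python with the PyTorch library, pits a tiny generator against a tiny discriminator on random placeholder data. It is only an illustration of the training idea described above, not an actual face-swapping pipeline: real deepfake tools use far larger networks trained on real video frames, and every network size and setting here is an arbitrary placeholder.

# A minimal sketch of adversarial (GAN) training, NOT a deepfake tool.
# The "real" frames are random noise stand-ins; shapes are toy values.
import torch
import torch.nn as nn

IMG_DIM, NOISE_DIM = 64 * 64, 100  # flattened 64x64 grayscale frame (toy)

# Generator: maps random noise to a fake "frame."
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)

# Discriminator: scores how "real" a frame looks (raw logit output).
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(200):
    real = torch.randn(32, IMG_DIM)   # placeholder for real training frames
    noise = torch.randn(32, NOISE_DIM)
    fake = generator(noise)

    # 1) Train the discriminator to tell real frames from fakes.
    d_opt.zero_grad()
    d_loss = (loss_fn(discriminator(real), torch.ones(32, 1)) +
              loss_fn(discriminator(fake.detach()), torch.zeros(32, 1)))
    d_loss.backward()
    d_opt.step()

    # 2) Train the generator to fool the discriminator: each flaw the
    #    discriminator catches becomes a correction signal for the generator.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()

Run long enough on real images instead of noise, this tug-of-war is what drives generated frames to look increasingly authentic, which is why the finished videos are so hard to flag.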

Dr. Don McCulloch, a licensed psychologist who specializes in helping clients overcome pornography addiction, explains that pornography is so alluring because of something called the Coolidge effect. This phenomenon relates to the brain’s search for novelty. As viewers build up tolerance to seeing the lewd images and videos, they begin looking for more stimulating content—younger subjects or more violent acts, for example. Now that people can immediately find pornography customized to their exact preferences, their dopamine threshold will peak much more quickly, leading to greater abuse and sexualization in the real world.

AI technology was used to produce a fake video of Ukrainian President Volodymyr Zelenskyy calling on his soldiers to lay down their weapons. Olivier Douliery/AFP via Getty Images

With the advent of AI pornography, McCulloch believes pornography will become even more addictive and have devastating consequences for marriages. “People who consume pornography have an issue with warped reality. … The ability to determine between what’s real and what isn’t is going to be lost,” said McCulloch.

For victims of AI porn, the consequences are very real. As Martin noted, “The emotional violation of seeing yourself depicted in such a way, it was extremely dehumanizing.”

Martin has used her experience to encourage legal reform regarding deepfakes in her home country of Australia. U.S. law, by contrast, has not caught up to the technology. Even the actors in the original adult films might not be able to sue for copyright violation, since many deepfakes may count as “transformative works” under Section 107 of the Copyright Act, the fair use provision protecting works that have been significantly altered from the original.

Because AI-generated pornography depicts a person’s likeness rather than the actual person, most state laws are not specific enough to criminalize deepfakes. While 48 states protect targets of nonconsensual pornography (intimate imagery distributed without the subject’s consent, often to harass or humiliate), only four states have laws specifically addressing deepfake pornography, according to the Cyber Civil Rights Initiative.

Lawmakers in several states are beginning to introduce legislation to criminalize the activity. Distributing revenge porn is already illegal in California, but Republican Assemblyman Tri Ta wants to update his state’s law to keep pace with the technology. A bill scheduled to come before a legislative committee in late April specifies that perpetrators of AI porn could face a $1,000 fine or one year in prison.

New Jersey GOP state Sen. Kristin Corrado introduced a bill last month that would treat deepfake pornography with the same severity as nonconsensual pornography. If found guilty, perpetrators could face up to $15,000 in fines. Since child pornography is a second-degree crime in New Jersey, anyone who possesses or distributes AI porn that depicts abuse of minors might face up to 20 years in prison, if the bill is approved.

Updating the law is a step in the right direction. At the same time, perpetrators might be operating in states or countries that do not have prohibitions on AI porn. “The push,” said Martin, “is to figure out some sort of global solution to this borderless global issue.”


Bekah McCallum

Bekah is a reviewer, reporter, and editorial assistant at WORLD. She is a graduate of World Journalism Institute and Anderson University.
