Senate bill would force social media sites to remove deepfake porn
Lawmakers on Tuesday introduced a bill in the U.S. Senate that would require social media companies to remove deepfake pornography from their platforms within 48 hours of a victim’s request. The Take It Down Act would also make publishing or threatening to publish deepfake pornography a federal crime. Sen. Ted Cruz, R-Texas, and Sen. Amy Klobuchar, D-Minn., are the main sponsors of the bipartisan bill.
What is deepfake pornography? Creators of deepfake pornography use artificial intelligence to make sexual images of real people. Victims featured in deepfakes are mostly women and range from celebrities like Taylor Swift to high school girls. Pornography makes up 98 percent of all deepfake videos online, and pornographic deepfake videos increased by 464 percent from 2022 to 2023, according to research by Home Security Heroes.
What else have lawmakers done to combat deepfake pornography? U.S. senators in January introduced a competing bipartisan bill that would allow victims to sue creators and distributors of deepfake pornography. Twenty U.S. states have laws addressing deepfake pornography, but there is currently no federal law specific to the issue.
Dig deeper: Listen to Mary Muncy’s report on The WORLD and Everything In It podcast about deepfake technology.