Trump signs bill federally banning deepfake pornography
President Donald Trump signing the "Take It Down Act" with first lady Melania Trump, seated right. Associated Press / Photo by Evan Vucci

President Donald Trump signed the Take It Down Act into law on Monday, banning the nonconsensual publication of artificial intelligence-generated pornography that depicts real people. The law also prohibits the distribution of genuine intimate images of someone without his or her consent. It aims to combat the rise of online exploitation and so-called revenge porn, the vindictive sharing of intimate images of people, whether real or AI-generated, without their knowledge or consent. The Senate unanimously passed the bipartisan bill in February, and the House approved it by a landslide 409-2 vote in late April, with 22 members not voting.
First lady Melania Trump rallied support for the measure as part of her initiative to improve children’s well-being. She addressed legislators before the Monday signing and thanked them for coming together and putting people over political party. The weaponization of social media and AI has devastating emotional effects and can even be deadly, the first lady said. She described hearing from survivors about the emotional and psychological toll of deepfakes and the non-consensual sharing of intimate imagery.
How does the law work? The law aims to empower victims of both real and deepfake non-consensual intimate image sharing at the federal level, rather than leaving them to depend on individual state laws.
The measure defines “digital forgery,” or so-called deepfake pornography, as realistic, computer-generated pornographic images and videos that portray real, identifiable people. It also specifies that a victim who consented to the creation of an image or video has not thereby consented to its being shared.
The act makes it illegal to knowingly post revenge porn, whether real or fake, to any online platform, and it criminalizes the publication of non-consensual intimate images in interstate or foreign commerce.
The law also protects good-faith disclosures of such images, such as reports made to law enforcement or disclosures made to assist victims.
The act requires web platforms to take down non-consensual intimate images once a victim reports them. Social media platforms must have procedures in place to remove reported content within 48 hours. The Federal Trade Commission will enforce the requirement that websites make reasonable efforts to remove flagged content.
The law also aims to protect First Amendment rights by applying a “reasonable person” test: AI-generated content falls under the law only if a reasonable person would find it indistinguishable from an authentic image of the victim.
Dig Deeper: Read Elizabeth Russell’s previous report on the House vote for more background.
