Rep. Jay Obernolte, R-Calif., chairman of the House's AI Task Force (Associated Press / Photo by Rich Pedroncelli)

LINDSAY MAST, HOST: It’s Wednesday, the 7th of May.
Glad to have you along for today’s edition of The World and Everything in It. Good morning, I’m Lindsay Mast.
NICK EICHER, HOST: And I’m Nick Eicher.
It’s Washington Wednesday. Our main story today is one that is probably not suitable for younger ears. It has to do with AI-generated explicit deepfake images of real people. That’ll get started shortly, so you’ve got time to decide whether it’s appropriate.
MAST: First, an update on the federal budget.
Last week, the White House released its spending blueprint for the new fiscal year. It calls for a 23-percent reduction in discretionary spending by Congress.
Cedarville University economics professor Jared Pincin spoke with WORLD’s Washington bureau … He said it’s been a long time coming.
PINCIN: I don’t know if I can remember a time where they’ve actually cut spending—like actually cut spending.
EICHER: Even so, the budget doesn’t touch entitlement programs that add the most to the deficits and debt. Republicans like Congressman Rich McCormick of Georgia say it’s just a start—and that’s putting it generously.
MCCORMICK: Remember, non-discretionary spending makes up 10% of the budget. And we expanded the military, so obviously, 10% of the budget and we’re talking about reducing [it] by 24%, we’ll take 1/10 of that. That’s 2.4 percent of the budget.
MAST: The budget also seeks an additional 13% for defense, and that accounts for more than a trillion dollars.
Congress will be hammering out the numbers in the weeks to come. More on that as the process unfolds. In the meantime, our Washington team will keep you posted online at wng.org, and we’ll have a link for you to follow in the program transcript.
EICHER: Last week, the House passed the TAKE IT DOWN Act—one of the first AI-focused bills to win congressional approval. It requires social platforms to remove non-consensual intimate deepfakes within 48 hours—and criminalizes their publication. WORLD Washington bureau reporter Leo Briceno explains what this means for victims and tech companies.
LEO BRICENO: Noelle Martin was 18 when she searched for images of herself on Google.
MARTIN: Before I ever found out what was happening to me, it was already too late.
Strangers online had taken photos from her social media and manipulated them into pornographic images. Law enforcement told her there was nothing they could do to bring the perpetrators to justice. And websites were slow to remove the images, while users online copied them and posted them elsewhere.
MARTIN: I could never ever get on top of the situation because it had spread too much.
Today, Martin, who works in partnership with The University of Western Australia, is an advocate against the creation and spread of intimate deepfakes.
It’s an old problem supercharged by artificial intelligence.
MARTIN: It’s become a fully fledged online industry. The biggest sites that have been created that are dedicated to this abuse have amassed billions of views.
Lawmakers in the U.S. have taken notice. Last Monday, Congress passed a bill to address those harms: the TAKE IT DOWN Act. It’s one of the first AI-related bills to pass Congress. Here’s Congressman Jay Obernolte of California, former chairman of the Bipartisan Task Force on Artificial Intelligence.
OBERNOLTE: In one way, I’m very happy, obviously, as a co-sponsor of the bill, and someone who’s been very vocal in expressing my belief that non-consensual intimate imagery is something we should all be able to agree is not okay.
The bill passed in an overwhelming 409-2 vote. Advocates take that as a good sign it will become law.
GAETAN: Sometimes an issue is so important it takes Congress by storm and thankfully Take It Down took Congress by storm.
Eleanor Gaetan is vice president and director of public policy at the National Center on Sexual Exploitation, or NCOSE.
GAETAN: It requires online platforms to remove non-consensual sexually explicit material within 48 hours of being notified. And then it criminalizes the publication of those images whether they’re actual images of a real person or AI generated images that look exactly like a person.
Violating the act could result in up to three years behind bars and/or a fine of up to $250,000. The law would apply in instances of interstate or foreign commerce.
So what about the votes against the bill?
Kentucky Congressman Thomas Massie had concerns about the penalties.
MASSIE: It’s so vague. My great concern is that the platforms that would be liable for hosting this stuff are just going to put on such a strong filter that a lot of stuff gets filtered out. And then also I think they’re going to be on a hair trigger to take stuff down given the severity of the punishment if they don’t.
He’s also worried about what the two-day requirement means for smaller web platforms that may lack capacity to police their content.
MASSIE: Like what’s a small startup web hosting thing that’s like the next Facebook or the next Twitter, they’re gonna have to spend you know a lot of money trying to make sure that this problem this regulation doesn’t—they don’t run afoul of this regulation and it may keep them from getting started up.
Massie says he would like clearer guidelines in the bill’s language on what it means to create an image in someone else’s likeness.
Gaetan, the vice president at NCOSE, believes innovation can help address some of Massie’s concerns.
GAETAN: There is extremely successful AI tools to identify imagery that would comply with this law and get rid of it. So we believe in the incredible success and genius of these companies to be able to solve so many problems including this one.
When asked about the two-day window, the bill’s co-sponsor Congressman Obernolte told me that he’s expecting legal challenges, but that the line has to be drawn somewhere.
OBERNOLTE: To anyone who objects to the two-day provision, my question is, well, what is the right amount of time? Two days, I think as we get into implementation of the bill, we’ll see if that’s too long or too short, or just right.
Last year, Australia amended its Criminal Code to give platforms 24 hours to take down non-consensual deepfake content. But Noelle Martin says companies regularly ignore those requirements.
MARTIN: Some of the challenges that we have is websites not actually complying with any requests or they are taking too long to respond. And then by that time, things are reposted, amplified, resurfaced.
Martin says the burden of compliance should be shared by everyone involved—users, platforms, AI companies, and more. She believes that’s the only way to attack the problem.
MARTIN: Like you have to do both. It can’t be one emphasis on just the removal without also holding the whole pipeline to account, otherwise the response is going to be superficial and won’t tackle the root causes of this.
So what can families do to protect their children from this? Some Christian researchers say it’s time to get off—and stay off—social media.
MORELL: The platforms and its algorithms celebrate vice rather than virtue.
Clare Morell is a fellow at the Ethics and Public Policy Center in Washington, D.C.
She says that families should think carefully about whether they use social media, since the pictures and videos young people post online can be exploited.
MORELL: Be really careful who you share photos with, who you text photos to and be very careful uploading them online. I personally would advise parents to not allow their children under 18 on social media for this reason. But if they are, then they should be really cautioned about being careful knowing that that can sadly be used against them.
The TAKE IT DOWN Act is now headed to President Donald Trump’s desk for his signature.
Reporting for WORLD, I’m Leo Briceno in Washington, D.C.
WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.