MARY REICHARD, HOST: It’s Tuesday, the 18th of July, 2023.
Glad to have you along for today’s edition of The World and Everything in It. Good morning, I’m Mary Reichard.
NICK EICHER, HOST: And I’m Nick Eicher.
First up on The World and Everything in It: Artificial Intelligence.
Last week, the news was full of AI: from secret briefings at the Senate, to Elon Musk’s new company xAI, to the new Mission: Impossible. What’s going on, and how should we understand the opportunities and threats?
REICHARD: Joining us now is Jason Thacker. He’s an assistant professor of philosophy and ethics at Boyce College in Louisville. He is also the director of the research institute at The Ethics and Religious Liberty Commission.
Good morning, Jason.
JASON THACKER: Morning, Mary. Thanks for having me.
REICHARD: Last Tuesday, the Senate held a closed-door meeting to discuss the national security threats of artificial intelligence. Now, we don’t yet know the specifics of what they heard, but later in the week Senate subcommittees held hearings to discuss copyright violation and defamation risks.
It’s ironic that on Wednesday, Tom Cruise’s new movie about a rogue artificial intelligence system hit the big screen, essentially driving home the point that some AI systems are just too powerful and dangerous to control.
Jason, what are some potential real-life national security threats of artificial intelligence?
THACKER: Yeah, I love the irony in that, because I think it reveals to us that AI is not some kind of far-off reality that maybe we’ll one day have to address; it’s actually affecting us deeply now. It’s shaping our understanding of God, ourselves, and the world around us. So we need to be thinking not only about some of the future potential and dangers, but also about a lot of the real-world influence now. In terms of the national security debate, there are a lot of different areas we could touch on, but some of the big ones involve misinformation and information warfare: the idea of being able not only to share these things widely, but also to create them widely, especially with generative AI. There’s the potential and the ongoing use of cyber warfare, increasingly employing artificial intelligence, as well as a lot of the military uses in weapons technology, especially increasingly putting humans in what is known as “off the loop,” in the sense of not being in direct control of some of these systems and how quickly they’re going to have to respond.
REICHARD: Meanwhile, on Friday, tech entrepreneur Elon Musk unveiled a new company called xAI. Here he is explaining it in a Twitter Spaces recording on Friday. You’ll hear him mention AGI. That stands for artificial general intelligence: a system’s ability to perform the wide variety of tasks humans do, at the same level or higher.
ELON MUSK: I guess the overarching goal of xAI is to build a good AGI with the overarching purpose of just trying to understand the universe. I think the safest way to build an AI is actually make one that is maximally curious and truth-seeking. So, like, you know, it’s not clear this will ever actually get fully to the truth, but you should always aspire to that.
Musk went on to talk about how AI could help researchers understand things like dark matter, gravity, and where all the aliens are.
Jason, this venture sounds like something straight out of The Hitchhiker’s Guide to the Galaxy. What do you think Musk is likely to find with this company?
THACKER: Yeah, I find it really interesting the way he frames that, because there are a lot of ways artificial intelligence will be used, and is already being used, to discover new aspects of reality and objective truth, especially a lot of empirical or scientific truths. But pairing that with AGI and saying that this is definitely a possibility, or something we’re trying to pursue, is interesting to me, because general AI is a dream for many. We’re not even sure if it’s possible from a philosophical standpoint, especially from a faith perspective, whether creating a human-like intelligence is even obtainable. We haven’t been able to do that before; everything so far is what’s known as narrow artificial intelligence. But I really think it comes down to the idea that the material world can’t reveal a lot of the meaning and value we’re looking for, even a lot of the moral truths we look for. That isn’t grounded in nature; it’s actually grounded in a supernatural power, in God Himself. And I think that’s something interesting as we start to navigate a lot of these questions. There are a lot of potentials, but there are also significant dangers and a lot of overhyped promises that I think we need to be aware of in these debates.
REICHARD: Well, in other news, the Instagram companion app called Threads was the number one downloaded program last week, that is, up until an app called Remini stole the spotlight. It’s designed to improve the quality of photos and videos. But on Thursday, the Wall Street Journal reported that users were able to go a step further and add elements to their pictures. In particular, there’s a feature that generates pictures of babies based on the features of adults in the uploaded photos. I mean, it’s no joke: the app creates photorealistic portraits of families that don’t even exist.
Now here's where the article gets interesting. Dalvin Brown, the reporter, interviewed several people who were using the app this way who are childless in their 20s and 30s. Brown says that this app is helping people to envision themselves as parents down the road and implies that that's a very good thing. So is it a good thing, or is something wrong with this picture?
THACKER: Well, I’d say it’s kind of both/and in many ways. I think there are some good things it reveals, even the centrality of the family. Often, in a society that prioritizes the isolated individual, and even makes us more isolated with our tools, it’s a reminder that we can’t just do life alone; there actually is the centrality of the family, God’s good design for men and women and children. So it’s revealing, in some sense, in a good way. But in other ways it’s also showing a lot of the loneliness I think we experience in our society, how addicted we are to our devices, and how many people, even in the story, have put off childbearing and put off relationships to focus on their careers. I also think it can give us a false sense of control and a false sense of power, in the sense that we are able to create these things. There are a host of privacy and data-related issues in terms of handling this type of information and uploading these types of photos to a server. So I would encourage, and caution, listeners to be very careful about the things we upload online, especially into a lot of these AI-based systems, because there are a lot of future ramifications and a lot we haven’t even figured out yet. There’s a lot going on in that story, obviously, but I do think it’s an interesting reminder of how we’ve been created with a longing for family and community.
REICHARD: Jason Thacker teaches at Boyce College and directs research at The Ethics and Religious Liberty Commission. Jason, good talking with you. Thank you.
THACKER: Yeah, thank you for having me.
WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.