
Artificial intelligence and American elections

The viability of our political institutions requires new expressions of virtue


President Joe Biden speaks about artificial intelligence at the White House on July 21, as tech executives stand nearby. Associated Press/Photo by Manuel Balce Ceneta


Every day seems to bring a new headline about the ways in which AI (which I’m told from a reliable source stands for “artificial intelligence”) is predicted to impact our world. In some cases these claims are exaggerated; in others the focus falls only on short-term impacts while the longer-term consequences of this kind of technology are discounted or ignored.

We’re living in an age of information abundance, and that information increasingly comes from unreliable or even suspect sources. Twitter is (or at least was at the time of Elon Musk’s purchase) rife with bots. The TikTok algorithm skews towards content intended to dumb down the youngest generations of Americans. Instagram seems almost purposely designed to depress teenage girls.

But the next iterations of machine-generated content, including from sources like ChatGPT and deepfake videos, promise ever-greater levels of sophistication and deception. Most spam emails remain pretty easy to spot. But AI-generated text, audio, and even video are increasingly difficult to distinguish from the real thing. On one level we might think of this kind of content as banal and unobjectionable, akin to elevator music for social media marketing. But on another level, we have yet to see this kind of technology weaponized for nefarious purposes on a larger scale.

Consider, for instance, the potential impact of AI-generated content on political elections. Enemies and adversaries of the United States, whether acting under the guidance of the Russian or Chinese governments or as non-state actors like Anonymous, have long sought to influence domestic politics, including through social media channels. These emerging technologies provide new tools to fool the electorate and the media.

Many elections are decided by razor-thin margins, and electoral data exists that shows which demographics, counties, and districts are the most significant. Bad actors could easily target these groups and would only need to succeed with a small percentage of their intended audiences to flip the results from one side to the other. These kinds of campaigns could be designed to suppress votes for one side, increase turnout for the other, or both. Even if the machines themselves are difficult to hack, the electorate can be hacked by fake news and disinformation.

We haven’t yet seen a widely influential effort of this sort, but it is only a matter of time before we encounter one on the national stage. Think of the potential impact of a fake video that plausibly shows a major presidential candidate in a compromising position or saying something offensive about a significant constituency. What would the consequences be if such an artifact were released to the public the day before an election, or even on Election Day itself? Would the mainstream media be able to responsibly curate stories about such a video? And even if they could, do they retain enough trust as reliable guides to point people towards the truth?

A number of large tech companies have come together to start working towards voluntary, industry-wide standards and agreements on best practices. This approach is far preferable to a governmental “ministry of truth” that predigests and dispenses information approved for public consumption. But much more needs to be done. Blockchain technology, for example, offers some promising possibilities for verifying the source of media content.
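To make that possibility a little more concrete, here is a simplified sketch (in Python, using the widely available “cryptography” package) of how content provenance generally works: a publisher signs a cryptographic fingerprint of its video, and anyone can later check whether a file they received matches that signature. The names, keys, and file contents below are illustrative stand-ins rather than any actual system, and a blockchain would simply be one tamper-evident place to publish the public key and signatures.

```python
# Illustrative sketch only: how a publisher could let readers verify that a
# video really came from them. A hash "fingerprints" the file, a digital
# signature ties that fingerprint to the publisher, and a public ledger
# (blockchain or otherwise) is just one place to post the key and signature
# so they cannot be quietly altered later.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def fingerprint(media_bytes: bytes) -> bytes:
    """Return the SHA-256 digest of a media file's raw bytes."""
    return hashlib.sha256(media_bytes).digest()


# Publisher's side: generate a keypair and sign the authentic video's fingerprint.
publisher_key = Ed25519PrivateKey.generate()          # kept secret by the outlet
original_video = b"raw bytes of the authentic video"  # placeholder content
signature = publisher_key.sign(fingerprint(original_video))
public_key = publisher_key.public_key()               # published openly


# Reader's side: check whether a received file matches what the publisher signed.
def came_from_publisher(media_bytes: bytes) -> bool:
    try:
        public_key.verify(signature, fingerprint(media_bytes))
        return True
    except InvalidSignature:
        return False


print(came_from_publisher(original_video))                       # True
print(came_from_publisher(b"raw bytes of a doctored deepfake"))  # False
```

A scheme like this can show whether a file matches what a trusted outlet actually published; it cannot, by itself, tell us whether the original footage was truthful, which is why technical verification is only part of the answer.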

And while there will undoubtedly be some technical tools that can be used to address different aspects of this problem, the challenge itself is far greater than can be solved by mere technique. We need to develop powerful personal, cultural, and spiritual habits that make us less vulnerable to manipulation by fake news and conspiracy theories.

We need to be more savvy about what we see and hear every day, whether that’s something repeated in a personal conversation or reposted on a friend’s Facebook feed. The physicist Richard Feynman had some words of wisdom that apply well here: “The first principle is that you must not fool yourself, and you are the easiest person to fool.” If you read something, especially something negative about someone else, and it is the kind of thing that would make you feel good if it were true, then you should approach it with even greater caution.

We might even put it this way: The commandment to love our neighbor as ourselves means that we ought to require a higher standard of proof before we believe something bad about someone else than before we believe something good or even neutral.

AI may not transform everything about our world, but over time it will touch almost everything. We are encountering only the first tremors of a technology that could do great harm to our civic institutions and our political life. There are concrete steps each one of us can take to begin to face these challenges. And, as in so many things, the first step is to recognize our own fallibility, limitations, and sinfulness and, as the Apostle Paul puts it, to keep our minds focused on godly things and to pursue peace: “If it is possible, as far as it depends on you, live at peace with everyone” (Romans 12:18).


Jordan J. Ballor

Jordan is director of research at the Center for Religion, Culture & Democracy, an initiative of First Liberty Institute, and the associate director of the Junius Institute for Digital Reformation Research at Calvin Theological Seminary and the Henry Institute for the Study of Christianity & Politics at Calvin University.

