
Missing the heart in music


WORLD Radio - Missing the heart in music

With advances in AI music, people lose the connection to an actual artist


MARY REICHARD, HOST: Today is Thursday, March 7. Thank you for turning to WORLD Radio to help start your day. Good morning. I’m Mary Reichard.

MYRNA BROWN, HOST: And I’m Myrna Brown.

REICHARD: Coming next on The World and Everything in It: Artificial Intelligence and Music. We’re finding new uses for AI all the time, and for many in the music industry, the tools are really cool.

BROWN: They are, but at what cost? I talked to a musician who’s asking that question, and who’s also determined to keep the art and the heart of music alive.

SOUND: [PRACTICE ROOM STUDENTS SINGING, PLAYING INSTRUMENTS]

MYRNA BROWN REPORTING: Inside tiny practice rooms, sopranos hit high notes, pianists perfect chord progressions and songwriters craft melodies and lyrics. Music bounces off the walls at the University of Mobile’s Alabama School of the Arts.

MYRNA: And you used to walk these halls, right?

GARRETT: Yeah, I was here for four years…

That’s Garrett Romine. He plays keys, guitar, and drums. A graduate student at the university, he says he’s not surprised AI has become part of the music equation.

GARRETT ROMINE: When you think about recording, you used to record on a tape and it mattered. And now you just record on a computer. And so, we’ve already kind of seen one huge shift in music.

Romine also sings and writes his own songs.

MUSIC: [“LIGHT HAS COME” PERFORMANCE]

In 2017, while still an undergraduate student, he wrote “Light Has Come,” a song that was featured in one of the school’s major productions and later published in a choral book. Romine says when it comes to songwriting, AI doesn’t measure up.

GARRETT ROMINE: Not that long ago, I just typed into an AI app like, write a song about this. And it spit it out and I was like… ehh. I think you can kind of tell at this point what’s totally written by a computer or by AI or not.

MYRNA: How?

ROMINE: It just felt clunky. It wasn't particularly moving. I think that’s the missing part of AI. As smart as it is, it doesn’t really have heart.

AUDIO: [ANDREA MARTONELLI PLAYING INSTRUMENT]

Two thousand miles away, Andrea Martonelli agrees with Romine. But Martonelli, a grad student at Queen Mary University of London, believes AI opens new possibilities for instrumentation and technique.

SOUND: [MUSIC TRADE SHOW CHATTER]

Today the musician is one of 16-hundred exhibitors showcasing AI technology at a global music trade show in Southern California.

AUDIO: [ANDREA MARTONELLI PLAYING INSTRUMENT]

A small crowd gathers around his booth to hear him beat his snare drum and strum his guitar… at the same time AND using the same instrument.

ANDREA MARTONELLI: The way it works is that it’s got a little bit of simple artificial intelligence. It tracks whether you’re coming down on the heel or coming down with the fingers and it’s consistent all across the body.

He calls it a HiTar, an advanced guitar with artificial intelligence sensors that read movements to make both drum and synthesizer sounds. Trade show attendee Ken Preece can’t take his eyes off Martonelli’s hands moving rapidly from the body of the guitar to the fretboard.

KEN PREECE: This is going to change guitar playing, no doubt about it. If I can touch the guitar and get different sounds, and depending on how I hit it, get another different sound, that’s next level.
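What Martonelli describes amounts to a small gesture-to-sound mapping: classify the kind of hit, then trigger the matching sound. Here’s a minimal Python sketch of that idea; the sensor features, thresholds, and drum note numbers are illustrative assumptions, not the HiTar’s actual implementation.

from dataclasses import dataclass

# General MIDI percussion notes: 36 is a bass drum, 38 is a snare drum.
HEEL_HIT_NOTE = 36
FINGER_HIT_NOTE = 38

@dataclass
class HitEvent:
    contact_area: float   # 0.0-1.0, larger for a broad heel-of-the-hand strike
    attack_speed: float   # 0.0-1.0, how sharply the sensor reading rose

def classify_hit(event: HitEvent) -> str:
    # Stand-in for the "little bit of simple artificial intelligence":
    # a broad, slower contact reads as the heel; a small, fast one as fingers.
    if event.contact_area > 0.5 and event.attack_speed < 0.7:
        return "heel"
    return "fingers"

def hit_to_drum_note(event: HitEvent) -> int:
    # The same rule applies anywhere on the body, so the response stays consistent.
    return HEEL_HIT_NOTE if classify_hit(event) == "heel" else FINGER_HIT_NOTE

if __name__ == "__main__":
    for e in (HitEvent(0.8, 0.4), HitEvent(0.2, 0.9)):
        print(classify_hit(e), "->", hit_to_drum_note(e))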

AUDIO: [VIRTUAL INSTRUMENT]

Across the pond, one of Martonelli’s classmates is working on another AI tool built around extended reality, or XR, a way of extending the physical reality we live in. Max Graf sits at a wooden desk in a dark studio at London’s Queen Mary University, wearing a white augmented reality headset. With its strap across the top of his head and its huge goggles covering his eyes, the getup makes him look like a sci-fi character. But it enables him to play the virtual instrument he calls “Netz”.

MAX GRAF: The instruments that I envision in the future and also instruments that I want to build myself are instruments that harness physical instruments or have new physical instruments with counterparts in the virtual world.

The AI tool tracks Graf’s hand movements and turns them into corresponding outputs like notes and chords.
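In the same spirit, here is a rough Python sketch of turning tracked hand data into notes and chords, the kind of mapping Graf describes. The hand tracking itself is stubbed out as plain numbers, and the scale, pinch gesture, and chord shape are assumptions for illustration, not how Netz actually works.

C_MAJOR_SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI notes C4 through C5

def height_to_note(hand_height: float) -> int:
    # Map a normalized hand height (0.0 low, 1.0 high) onto a scale degree.
    index = min(int(hand_height * len(C_MAJOR_SCALE)), len(C_MAJOR_SCALE) - 1)
    return C_MAJOR_SCALE[index]

def pinch_to_chord(root: int, pinched: bool) -> list[int]:
    # A pinch gesture fills the single note out into a simple major triad.
    return [root, root + 4, root + 7] if pinched else [root]

if __name__ == "__main__":
    root = height_to_note(0.5)          # hand held at mid-height -> G4 (67)
    print(pinch_to_chord(root, True))   # [67, 71, 74]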

But what about live music that extends beyond the four walls of a studio? Back in Alabama, Garrett Romine says those are the dots AI can’t connect.

GARRETT ROMINE: We still want to hear music that we like and see the people behind it. And I don’t know if that will change. And so even if AI music and generative AI makes cool music, I still don’t think in a massive way that replaces the fact that people want to be connected to the person that wrote it or the person who sings it.

Reporting for WORLD, I’m Myrna Brown.


WORLD Radio transcripts are created on a rush deadline. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of WORLD Radio programming is the audio record.
