The Impact of AI on the Music Making Process

Last week I gave a talk on AI in music education. One of the main messages I was trying to convey was that while AI tools can help music teachers spend more time making music with their students and less time on administrative tasks, AI in the music creation process is a far more problematic topic. Whether we like it or not, Generative AI is now part of the daily lives of our students, and more specifically, their musical lives - both in the music they listen to and in the tools they use to create it. Using tools like Suno and Udio, they can type a prompt and instantly hear a finished track. The music is often impressive, but they have no agency over it because they did relatively nothing in the creation process. For many teachers, that feels unsettling at first. I recently came across a TED Talk on YouTube by Dustin Ballard titled “Why I Use AI to Ruin Your Favorite Song.” I was immediately intrigued and pressed play. I am glad that I did. In his talk, Ballard offers a helpful way to think about AI in the music making process. Instead of asking whether AI will replace musicians, he asks what makes music meaningful at all.

The music industry, and music education along with it, has always lived through technological change. Recorded sound, synthesizers, and sampling all raised concerns when they appeared. None of them removed musicianship. They simply changed what musicians needed to understand. Ballard, like others, places AI in that same category. The tools evolve, but the role of the teacher does not. Our job remains, and will always remain, to help students understand what makes music making and music creation so important in society. What I like most about his TED Talk is that he takes the audience through the processes he uses to create music with various AI tools - everything from AI voice models that replace his recorded vocal tracks to lyric writing tools. It’s nice to see these tools used in context - especially as he explains his thought process in deciding to use them.

Ballard encourages simple expectations for AI use. Be honest about how something was created. Use the tool with intention. Think about the people whose music inspired the result. Those ideas fit naturally into rehearsal and project-based learning. Instead of banning tools, teachers can frame responsibility. If you are going to use AI-powered music creation tools with your students, have them focus on understanding the finished product. Students can test harmonies faster, explore instrumentation more freely, and compare stylistic options in real time. Because creation takes less time, listening takes on greater importance. The teacher becomes a guide to judgment rather than a gatekeeper of access.

This leads directly to questions students already care about. They notice when an AI voice sounds like an artist who never recorded the song. That moment opens conversations about authorship, copyright, and artistic identity. These discussions used to belong mostly in college classrooms. Now they belong in every music room. In my experience, younger students have strong feelings about using AI to create music - mostly negative - and that gives me great hope for the future of music making.

Ballard’s YouTube channel, There I Ruined It, is a LOT of fun because he takes two seemingly opposite songs and combines them using AI tools. The resulting videos are funny at first but quickly raise questions. Why does this feel uncomfortable? Should the listener be told AI was involved? Is imitation acceptable if the intent is parody? Students will engage quickly because the examples are short and memorable, and use music from a wide variety of genres. The ethical conversation starts naturally instead of feeling forced. Teachers can build structured discussions from these reactions. When does technology support creativity, and when does it eliminate a human from the equation? How does respect for artists fit into these new tools? Do they respect those artists? What responsibility does a creator have to the listener? These are not abstract questions. Students encounter them whenever they open an AI music generator.

None of this replaces performance. Ensemble rehearsal, technique development, and musical expression still matter deeply. AI simply surrounds those experiences with more opportunities to explore ideas of ownership and agency. A student might generate several melodic options and explain which one best serves the musical goal. Assessment becomes explanation rather than speed. In my opinion, Ballard’s perspective is a great conversation starter in the classroom. I urge you to check out his channel, find a track that you think will resonate with your students, and play it for them. See what they think. Here is one of my favorites from his channel. It combines Black Sabbath and John Denver.

Music teaching has always centered on thoughtful decision making. Technology supports that process rather than defines it. AI gives students more material to respond to and more chances to refine their taste. The question, therefore, becomes simple: not whether AI belongs in the classroom, but how it can help students think musically. When guided carefully, it becomes just another tool that supports excellent teaching rather than replacing it. In the end, I think that is where AI will live once all of the hype, fear, and debates fade away.
