When the AI Tops the Billboard Charts: What “Walk My Walk” Says About Music, Ethics, and Education

This morning on CBS Mornings, there was a story about the new number one hit on Billboard's Country Digital Song Sales chart: a song called “Walk My Walk” by a band called Breaking Rust. The reason it was being featured? The song was 100% written by AI, and Breaking Rust doesn’t actually exist. For some reason, this is the straw that broke this camel’s back. While listening to Nate Burleson talk about a new Billboard category for AI-generated music, I felt something shift inside my brain. It’s not really about the technology — we all knew this was coming — but more about what we as a society consider music.

Although I don’t really want to generate any royalties for the song, here it is for reference:

Is it a catchy song that sounds like MANY other country artists? Absolutely, but this one hits different for me. This isn’t a “collaboration” between humans and machines. There’s no singer with a backstory, no songwriter refining a lyric until it feels right, no instrumentalist adding a final expressive touch. “Walk My Walk” is an artifact of computation — a statistical model of what country music sounds like. For many, this kind of generative AI output is what’s known as AI slop. Yet it now sits at the top of a real Billboard chart beside artists who live, breathe, and tour their songs. That alone should give every music educator pause.

The issue isn’t that technology is used in the creative process — musicians have always embraced tools, from multitrack recorders to Auto-Tune. The issue is authorship, authenticity, and honesty. When an AI-generated “artist” competes in the same ecosystem as humans, it blurs the line between art and imitation. The listener can no longer trust that what they’re hearing is the product of human emotion or lived experience. Instead, it’s a well-trained facsimile built on massive datasets of human work — most of which were used without consent. Perhaps the simplest question of all is: why? Why release music that isn’t made by a human in the first place? Is it to show off the technology? Maybe. But the cynic in me knows it’s all about making money. And just who is behind this fake band? While the songs are credited to Aubierre Rivaldo Taylor, I would imagine that is a totally made-up name. Try Googling him/her. Again, the cynic in me thinks it’s just a way for Spotify to make money, cutting the creatives completely out of the process. The only thing that gives me comfort is that the ridiculously small streaming royalty rate has netted the song only about $9,000.

That raises profound ethical questions. Who owns the training data that made Breaking Rust possible? Were the voices, lyrics, and musical styles of real country artists fed into the algorithm to teach it what “authentic” sounds like? If so, those human creators were never compensated or credited. The AI’s “success” is built on the unacknowledged labor of thousands of songwriters and performers whose art was mined, remixed, and anonymized. Imagine explaining to your students that an AI song trained on the voices of Dolly Parton, Chris Stapleton, and Kacey Musgraves just outsold them all — and no one got paid for it. How do we teach respect for creativity and intellectual property when the very system they aspire to join rewards automation over artistry?

There’s also the question of transparency. Listeners who streamed “Walk My Walk” may not have realized it was generated by AI. The “artist” was presented with a name, a backstory, and even a social media presence — none of which were real. Do listeners really care? I would hope that most would, but who knows. Should music distributors and charting agencies be required to disclose when a song is machine-made? I certainly think so, but right now there’s no rule that says they must. That means a computer can compete against a human for the same spots, awards, and royalties, without disclosing its origin. That’s not innovation — that’s deception. Can you imagine awarding a GRAMMY to an AI algorithm? Who would walk up on stage to accept the award?

As educators, we are uniquely positioned to unpack this moment and to discuss it with our students, who will grow up in a world with AI-generated music. In the classroom, we teach students to value originality, to credit sources, to refine ideas until they express something authentic. When an AI “artist” earns commercial success, it undercuts that lesson. It tells young musicians that the industry now values replication over creation. We should use this news as a case study for critical discussion. Ask your students: If AI models are trained on existing songs without permission, is that plagiarism? What if the model generates something “new,” but its style and phrasing come directly from human recordings? Should AI music be eligible for awards or chart placement at all? Should there be separate charts for human and AI works? These aren’t hypothetical questions — they are becoming the central debates of modern music.

Of course, another ethical question is about cultural appropriation. Country music is deeply tied to regional identity and storytelling. If an AI system trained largely on mainstream Nashville hits produces a “country” song, what happens to the diverse voices, traditions, and histories that have shaped that genre? When algorithms decide what “sounds country,” “sounds hip-hop,” or “sounds Latin,” they risk flattening entire cultures into predictable sonic templates.

The ethical path forward begins with transparency, consent, and education. Students must understand how AI models are trained, who profits from them, and why attribution matters. They should experiment with generative tools — not to replace musicianship, but to study how they work, what biases they reflect, and what they leave out. Most importantly, they should be reminded that their human experiences — the breakups, the triumphs, the community — are what make their music irreplaceable.

We can’t stop AI from composing. But we can teach our students to question it, challenge it, and define what they stand for as artists. That’s the real work of music education now. When an algorithm tops the charts, it’s not proof of progress — it’s a mirror. It shows us what the industry is willing to sacrifice for efficiency and novelty. In my opinion, our job as educators is to help the next generation see beyond the hype and reclaim the human values that music was built upon.
