Moises Introduces AI Studio. It’s Awesome.
If you’ve read my blog or heard me speak about AI in music education, you probably already know that I am a HUGE fan of Moises.ai. It is a perfect tool for music educators, musicians, and students to go inside any music recording - separating tracks, adjusting tempo and key, finding the chords and lyrics of a song, and much, much more. I always demonstrate the technology behind Moises when I meet with music educators. They recently announced an incredible new extension of their technology called Moises AI Studio, and it’s awesome. Here’s what it does, and why you should definitely check it out. I think it can greatly enhance the songwriting process and jumpstart any creative music-making endeavor.
Moises has long been one of my favorite AI-powered music apps because of its powerful stem‑separation tools - enabling users to isolate vocals, drums, bass, and other instrumental layers. These tools have made it easier for students to study arrangement, deconstruct their favorite pieces, and build deeper analytical listening skills. Now, Moises has stepped up their game by allowing users not just to deconstruct music, but to co‑create it. AI Studio listens to the audio tracks that you upload and then generates instrument parts - stems - that align with their rhythm, harmony, and style. This means students or teachers can upload a melody, a chord progression, or even a rhythm track, then ask the system to “play along,” offering bass lines or rhythmic accompaniment that are musically responsive and editable. And it works really well.
For music educators teaching composition or arranging to their students, this tool offers an extraordinary opportunity. You can set the musical context - tempo, phrasing, harmonic direction - and students can experiment with generating unique instrumental layers. In the classroom, this becomes a living demonstration of how harmony, style, and rhythmic interplay coalesce. You might ask students to compare a stem generated with strong harmonic alignment versus one with a freer, more experimental feel - watching firsthand how the AI’s “weights” influence the outcome.
Perhaps the greatest pedagogical feature is that AI Studio keeps the student - not the algorithm - at the center. I think it’s a perfect example of how to balance students generating content and AI generating content for the same project. Teachers can encourage students to regenerate stems, tweak prompts or references, and engage in genuine creative problem‑solving. Here’s a quick video on how it works:
Here’s a quick step-by-step walkthrough to get you (or your students) creating in Moises AI Studio.
Open AI Studio and sign in. Use the web app or desktop app - both work the same way. From the home screen, choose “AI Studio.”
Start a project. Upload a short idea (riff, melody, chordal loop) or begin with a prompt. AI Studio analyzes your timing and tone so anything it generates locks to your performance.
Pick what you want to add. Choose a stem type - drums, bass, guitar, keys, etc. - to “play along” with your idea.
Choose a style. Apply a Genre Preset (e.g., funk, rock, pop) so the AI knows the groove and feel to target. You can change this later.
Set the vibe and expressiveness. Use the Creative Control options to make parts more laid-back or more active (fills, dynamics, articulation).
Generate the stem. Preview what AI Studio creates against your source. If it’s close but not perfect, tweak the style/controls and regenerate. You keep what you like and replace what you don’t.
Build out sections. Highlight a region and use the Smart Selection Generator to fill transitions, add variations, or extend a groove without breaking flow.
Add more instruments. Repeat the process for additional parts (e.g., bass + drums + rhythm guitar) until the arrangement feels balanced.
Mix quickly. Turn on AI Mixing, pick a genre/instrument preset, then fine-tune levels and clarity so parts sit together.
Master and export. Enable AI Mastering, set your sample rate, then export either the full mix or individual stems for your DAW or performance tracks.
One quick suggestion: keep student inputs short (8–16 bars) for fast iterations, then use regenerate/variation passes to compare arrangement choices in real time - great for discussing groove, voicing, and texture.
Moises AI Studio is free to try right now, so I would definitely suggest kicking the tires - even with your students. I’m looking forward to seeing how this technology grows over the next year or two. I love what this team is doing, and I think music educators should seriously look into how they can incorporate it into their teaching.