Google Magenta Studio: A Free AI Music Generator

Over the past few months, we've been covering OpenAI applications like MuseNet and Jukebox. These are some of the most cutting-edge AI music applications to date. But OpenAI wasn't the first company to venture into this space. Google started publishing generative music applications a few years earlier, under the name Magenta.

The Magenta team first presented the project at a Moogfest workshop in 2016. A year later, Google Brain researcher Douglas Eck gave the following talk on TensorFlow, explaining how artificial intelligence learns to play virtual musical instruments.

On January 26, 2023, Google published a paper on a new machine learning model called MusicLM that turns text prompts into music.

Google Magenta was well received by the press, but technical hurdles kept musicians from joining the fun. You needed to clone Magenta from GitHub, install it locally, and run TensorFlow yourself. This was tricky for non-developers to set up, and it would crash on computers with insufficient memory.

Introducing Magenta Studio

Fortunately, Google rolled out a set of music creativity tools called Magenta Studio in 2019. These standalone apps (and the matching Ableton Live plugins) are compatible with both macOS and Windows. This was the moment the Google Magenta project became accessible to everyday music producers.

On March 21 of that year, Google also published a classical music generator on its homepage. Powered by Magenta's neural network, the mini-game let you write a short melody and turn it into a piano composition in the style of J.S. Bach. The experience looked great visually, and the music generation was impressive too.

I had tried that Bach music generator but never got around to Magenta Studio. To see what the hype was all about, I recently downloaded the app collection and spent a couple of hours testing it.

How to Access and Use Magenta Studio

The Magenta Studio Collection

To download Magenta Studio, visit the website and pick either the Ableton Live plugins or the standalone apps. Unzip the file and you'll find five applications to choose from: Continue, Drumify, Generate, Groove, and Interpolate.

  • Continue - Similar to MuseNet, Continue lets you upload a MIDI file and use Magenta's music transformer to extend the music with new material. Keep the temperature close to 1.0-1.2 for MIDI output that sounds related to the original input.

  • Drumify - Drumify creates grooves based on the MIDI file you upload. Google recommends uploading a single instrumental melody at a time for the best results. For example, upload a bass line and it will try to produce a drum beat that complements it.

  • Generate - Perhaps the closest tool in the collection to a 'random note generator', Generate uses a Variational Autoencoder (MusicVAE) trained on a dataset of millions of melodies and rhythms.

  • Groove - This nifty tool takes a MIDI drum track and uses Magenta to modify the rhythm slightly, giving it a more human feel. If your music is overly quantized, or was performed sloppily, Groove can be a helpful tool.

  • Interpolate - This app asks you for two separate MIDI melody tracks. When you hit generate, Magenta composes a melody that bridges them together.
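To build intuition for the temperature knob mentioned under Continue, here is a toy sketch of temperature-scaled sampling in plain Python. This is not Magenta's actual code; the note names and scores are made up for illustration. The idea is standard: dividing a model's scores by the temperature before converting them to probabilities makes low temperatures conservative and high temperatures chaotic.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random.Random(0)):
    """Sample an index from unnormalized scores, scaled by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]  # subtract max for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

# Hypothetical scores for four candidate next notes (C, E, G, B), favoring C.
logits = [2.0, 1.0, 0.5, 0.1]
# Low temperature concentrates probability on the top-scoring note...
low = [sample_with_temperature(logits, 0.1, random.Random(i)) for i in range(20)]
# ...while high temperature flattens the distribution toward a random choice.
high = [sample_with_temperature(logits, 5.0, random.Random(i)) for i in range(20)]
```

At temperature 0.1 nearly every sample picks the top note; at 5.0 the picks scatter across all four. That is why values near 1.0 keep Continue's output related to the input without freezing it in place.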

Constructing an AI Music Stack

You can use any of these applications on their own, or you can chain them together in what I have started calling an AI music stack. For example, I used Generate to produce MIDI, Continue to elaborate on it, and Groove to humanize the variation I liked best.

This Generate > Continue > Groove stack can be run twice to produce two MIDI files. Then you can feed those into Interpolate to hear how Magenta fuses them together. After going through this cycle a few times, I started getting a feel for what Magenta's melodic mind sounds like.
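Under the hood, Interpolate works by encoding both melodies into MusicVAE's latent space and decoding points along the line between them. As a toy sketch of that idea, here is plain linear interpolation between two vectors; the "latent vectors" below are invented stand-ins, not real encoder output.

```python
def lerp(a, b, t):
    """Linearly interpolate between two equal-length vectors at position t in [0, 1]."""
    return [x + t * (y - x) for x, y in zip(a, b)]

# Hypothetical latent vectors standing in for two encoded melodies.
melody_a = [0.0, 2.0, 4.0, 5.0]
melody_b = [7.0, 5.0, 4.0, 2.0]

# Decode evenly spaced points along the line between them;
# the midpoint is roughly the "bridge" melody Interpolate plays back.
steps = [lerp(melody_a, melody_b, t / 4) for t in range(5)]
```

The endpoints reproduce the two inputs, and the intermediate steps morph smoothly from one to the other, which matches how Interpolate's output tends to sound.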

Using Magenta for Creative Inspiration

An open-source community has used Magenta to produce over a dozen user-friendly browser apps. Some of them will surprise you!

In this 2019 product demo, Google's Magenta team partnered with the creative agency Deeplocal and rock band The Flaming Lips to... turn fruit into music?

The keyboard player used a fruit piano as a MIDI controller and routed its output through a Magenta application called Piano Genie. When he touched a piece of fruit in the bowl, Magenta played back a note of its choosing. Frontman Wayne Coyne played along with the music, and a song was born.

When performing their AI song live, the Flaming Lips replaced the organic fruit with giant balloon imitations. Here's what that looked like:

This performance offers a glimpse into the ways AI music could push the boundaries of musical creativity. At the same time, there's little indication that AI music apps will be replacing professional musicians any time soon.

How Magenta Generates Music

In this video, a research scientist at Google’s Magenta Project provides a detailed description of how Magenta works.

“We represent the audio in terms of the frequencies present at each moment in time, using what’s called a spectrogram. Then we run them through hierarchical neural network stacks called LSTMs or RNNs.” - Jesse Engel, Google research scientist

Engel goes on to explain that these neural networks use prediction to compose new music. When you feed them MIDI, the algorithm attempts to guess what should come next, based on the dataset it was trained on.
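The prediction idea is easier to see with a deliberately simple stand-in. The toy model below is a first-order Markov chain, not an LSTM or RNN like Magenta uses, but it captures the same loop: count what tends to follow what in the training data, then extend a seed melody by repeatedly sampling a likely next pitch. The tiny "corpus" of MIDI pitch lists is invented for the example.

```python
from collections import defaultdict
import random

def train(melodies):
    """Count which MIDI pitch follows which across a corpus of pitch lists."""
    table = defaultdict(lambda: defaultdict(int))
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            table[a][b] += 1
    return table

def continue_melody(table, seed, length, rng=random.Random(42)):
    """Extend a seed melody by repeatedly sampling a likely next pitch."""
    out = list(seed)
    for _ in range(length):
        followers = table.get(out[-1])
        if not followers:
            break  # the last pitch never appeared mid-melody in training
        pitches = list(followers)
        weights = [followers[p] for p in pitches]
        out.append(rng.choices(pitches, weights=weights)[0])
    return out

# Tiny hypothetical "training set": C-major noodling as MIDI pitch numbers.
corpus = [[60, 62, 64, 62, 60], [60, 64, 62, 60], [62, 64, 62, 60]]
table = train(corpus)
extended = continue_melody(table, [60, 62], 8)
```

An LSTM differs in that it conditions on the whole history rather than just the last note, and it learns from millions of melodies rather than three, but the generate-by-prediction loop is the same shape.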

This software shows that generative music will continue to evolve over time. AI music composition workflows, like the branching node tree used over at MuseTree, may even become a staple feature of our DAWs someday.

Visit Magenta's get started page for a general introduction to the project. If you'd like to explore the more advanced neural network applications, check out the Google Colab environments they curate. I recommend NSynth, where you can experiment with neural audio synthesis in a hosted notebook environment.

Using AudioCipher with Magenta

Let's bring this all down to earth a bit and try an experiment.

Our MIDI plugin, AudioCipher, generates melodies to help musicians break through creative block. We provide an interface that lets you type in words and convert them into melodies, in any key signature.

Magenta Studio includes Continue, an app that will extend any melody you create with AudioCipher. All you need to do is design and generate your seed melody, export the MIDI file, and load it into Continue. Hit generate, and Magenta will run your AudioCipher melody through its neural network to produce fresh MIDI output.


Once Magenta has generated the MIDI files, navigate to the output folder and give them a listen. You can also drag them back into your DAW to hear the melodies played through a virtual instrument, which will sound much better.

In my experience, Magenta tends to be a bit all over the place with its melodies. But I did notice that it's capable of making novel, interesting decisions that I might not try otherwise.

So enough talk - let's get creative. Try this pairing and let us know what you think. Were you able to draw new ideas and inspiration from Magenta?

Magenta Studio vs OpenAI MuseNet

For those of you who have been following along, you might be wondering how Magenta stacks up against MuseNet.

Overall, Magenta Studio was fun and easy to use. As a free tool from one of the biggest software developers in the world, I would encourage any musician to grab a copy and try it at least once. There’s nothing to lose and you never know when you might stumble on something interesting.

While Magenta is a worthy competitor to MuseNet, I think OpenAI's software offers better MIDI output.

Magenta Studio didn't quite meet my expectations for quality. For a state-of-the-art deep learning algorithm, the MIDI output seemed to devolve into chaos pretty quickly. The original MIDI input was often lost almost as soon as the render began.

Other Magenta apps, like Interpolate, offered better quality in my opinion. I could upload two melodies and let it write something to bridge them. The fixed constraints seem to help it stay focused and write more reasonable melodies. I'll add that interpolation doesn't seem to be available in MuseNet.

For more ideas about how to generate song ideas with artificial intelligence, check out our review of the best AI music apps for 2022.
