Google’s Instrument Playground is an AI-powered experience that lets you define an instrument and start playing with its sound in seconds.
Similar to the looping features found in synthesizers or Apple’s GarageBand, the experience gives you a virtual keyboard and maps a 20-second generated clip to different keys, using Google’s MusicLM, the same model that powers MusicFX.
AI music is a fast-growing space, with tools like Suno capable of creating songs complete with vocals and lyrics from text prompts, and Rightsify’s Hydra II, which is trained entirely on licensed music and even provides you with audio stems.
Even Amazon’s Alexa and Adobe are working on music-related tools that do everything from simple song generation to full control over the output.
How does Instrument Playground work?
MusicLM is an artificial intelligence model trained on musical instrument samples and designed to convert text to music, much as AI image generators convert text to images.
Like Suno, Meta’s MusicGen or MusicFX, Instrument Playground can emulate hundreds of instruments from around the world, but unlike those tools it focuses specifically on manipulating a single instrument, as a synthesizer might.
It begins by creating a 20-second clip, using the instrument you describe as the prompt. Once it’s ready, you can use the keyboard (or the middle row of keys on a laptop) to play it like a song.
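Google hasn’t published how the key mapping works, but conceptually it resembles a classic sampler: one generated clip, re-pitched for each key. A minimal sketch in Python of that idea (all names here are hypothetical, and a sine tone stands in for the AI-generated clip):

```python
import numpy as np

SAMPLE_RATE = 22050

def make_clip(freq=440.0, seconds=1.0):
    """Stand-in for a generated instrument clip: a plain sine tone."""
    t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return np.sin(2 * np.pi * freq * t)

def pitch_shift(clip, semitones):
    """Naive sampler-style pitch shift by resampling.

    Like an old-school sampler, raising the pitch also shortens the
    clip (and vice versa), because we simply play it back faster.
    """
    ratio = 2 ** (semitones / 12)          # equal-temperament frequency ratio
    idx = np.arange(0, len(clip), ratio)   # fractional read positions
    return np.interp(idx, np.arange(len(clip)), clip)

# Map a laptop's middle-row keys to ascending semitone offsets.
KEY_TO_SEMITONE = {key: i for i, key in enumerate("asdfghjkl")}

def note_for_key(clip, key):
    """Return the clip re-pitched for the pressed key."""
    return pitch_shift(clip, KEY_TO_SEMITONE[key])
```

This is only an illustration of the sampler concept, not Google’s implementation; a production tool would use time-stretch-aware pitch shifting so every key plays a clip of the same length.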
First launched late last year, the tool was created by Simon Doury, Google’s artist-in-residence. “The starting point for this experiment was to find a playful interface based on MusicLM that inspires creativity and the discovery of instruments from around the world, for everyone,” said Doury.
The global range of instruments is one of the most interesting aspects of the tool, especially when you mix two of them together.
How well does Instrument Playground work?
Getting started with Instrument Playground is a bit tricky because you’re working with a generated track. It doesn’t behave like a piano: the clip is composed as a whole, so individual notes aren’t mapped directly to keys.
You can carry what you learn from experimenting with individual instruments here over to Google’s MusicFX, which combines instruments into a complete, AI-generated song.
I asked it for a spacey flute to work with, and after a few moments it produced a short clip of a slightly cheesy-sounding flute accompanied by a keyboard.
You can also enter Advanced mode and add up to four different instruments, building an entire track from looped elements of each AI-generated sample.
Why is this a big deal?
On the surface, Google’s Instrument Playground is just that: a fun sandbox for playing with AI. But under the hood is a powerful AI music model and, as we’ve seen with MusicFX’s DJ Mode, it could one day lead to a new form of music production.
Imagine being able to come up with a new instrument that might not exist in real life – a glass string piano that only plays in the vacuum of space – and being able to map it to a keyboard.
While there is controversy over the data provenance of AI music tools, and concern that being able to generate any song on demand will have far-reaching implications for the value of music, these tools will likely be put to best use by musicians creating sounds unlike anything heard before.