Making Music With IBM Watson Beat
Grammy award-winning music producer Alex Da Kid paired up with Watson in 2016 to create a whole new sound. IBM just released the Watson Beat code to the public, and we got to work.
Have you ever found yourself in a creative rut? As a musician, this happens to me all the time. After IBM released the Watson Beat code to the public on GitHub yesterday, I have a new muse to turn to for inspiration. This artificial intelligence (AI) tool is designed to compose music. Traditionally, people have thought of AI as a tool for data and analytics, but Watson Beat uses AI to turn data into creative expression. It’s like nothing you’ve ever seen (or heard) before.
A new evolution in music composition
Watson Beat is the music arm of IBM Watson. Its AI-based software composes music based on two parameters: emotion and style. It’s smart enough to understand the entire composition of a music file, from the melody to the chorus to the time signatures. Watson Beat’s machine learning technology takes those inputs and completely rearranges them, giving you a fresh way to hear the same chords.
Take a song like “Mary Had a Little Lamb,” assign it the emotion “romantic” and the style “reggae,” and Watson Beat will take those parameters and return a track that completely reimagines the well-known nursery rhyme.
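The shape of that request can be sketched in a few lines of Python. This is purely illustrative: the `compose_request` helper and the lists of emotions and styles are hypothetical stand-ins, not the actual Watson Beat API, though the melody itself is the real opening phrase of the nursery rhyme.

```python
# Hypothetical sketch of the two parameters described above: an emotion and a
# style applied to a source melody. The compose_request helper and the value
# sets below are illustrative only, not the actual Watson Beat API.

# Opening phrase of "Mary Had a Little Lamb" as MIDI note numbers (E D C D E E E).
MARY_MELODY = [64, 62, 60, 62, 64, 64, 64]

# Example values for illustration; not an official list of moods or styles.
EMOTIONS = {"romantic", "upbeat", "dark"}
STYLES = {"reggae", "edm", "pop"}

def compose_request(melody, emotion, style):
    """Bundle and sanity-check the inputs before handing them to a composer."""
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion: {emotion!r}")
    if style not in STYLES:
        raise ValueError(f"unknown style: {style!r}")
    return {"melody": melody, "emotion": emotion, "style": style}

request = compose_request(MARY_MELODY, "romantic", "reggae")
print(request["emotion"], request["style"])  # romantic reggae
```

The point of the sketch is how little the caller specifies: a source melody plus two high-level labels, with everything else left to the composition engine.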
The T3 Innovation team decided to test out this new tool and see what we could create in a matter of hours.
Crafting a track in less than two hours
When IBM released the Watson Beat code, our team wasted no time getting to work. We gave Watson Beat a MIDI file comprising three instruments (vibraphone, synthetic strings, and synthetic bass), four bars, and a tempo of 60 beats per minute.
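Those input numbers imply some concrete MIDI arithmetic, sketched below. Assuming 4/4 time (the article doesn't state the time signature), four bars at 60 BPM come to a 16-second seed, while the finished track ran 1:40.

```python
# Timing arithmetic for the input file described above: four bars at 60 BPM.
# 4/4 time is an assumption; the article doesn't state the time signature.

bpm = 60
beats_per_bar = 4   # assumed 4/4
bars = 4

# MIDI stores tempo as microseconds per quarter note.
microseconds_per_beat = 60_000_000 // bpm            # 1_000_000 µs = 1 s per beat
seconds_per_beat = 60 / bpm                          # 1.0 s
input_length_s = bars * beats_per_bar * seconds_per_beat  # 16.0 s

# The output track ran 1:40, i.e. 100 seconds.
output_length_s = 1 * 60 + 40

print(microseconds_per_beat, input_length_s, output_length_s)
```

Under that assumption, the 1:40 output is more than six times the length of the 16-second seed, which gives a sense of how much Watson Beat generates beyond its input.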
In less than an hour and a half, we produced a 1:40 track based entirely on that original MIDI file and Watson Beat’s deconstruction and ultimate reconstruction of it. The result sounds nothing like the original input, showing how Watson’s AI can completely reimagine what we provided.
As a musician, this was mind blowing. Watson was able to give me a composition faster than I could have ever gotten it out of my head. And that’s just scratching the surface of what Watson Beat can do.
Based on the guide rails you put in place, Watson Beat will continue to spit out composition after composition, giving musicians endless inspiration and an entirely new way to look at the creative process.
Using AI to amplify inspiration
So why is all this important? Aside from the obvious benefit of being able to crank out tunes fast, Watson Beat is another example of why we shouldn’t be afraid of AI. Many people see AI as a replacement for humans, but at T3 we think AI is about the augmentation of humans. Through Watson Beat, I was able to amplify my natural human abilities and take my music further than I thought possible.
Through this marriage of technology and humans, we can get to the next evolution in music, inspiration, and who knows what else.
To be continued…
Inspiration and composition by IBM Watson Beat. Produced by Brandon Gredler. Engineered by Joshua Brewer. Percussion by Austin Hegarty.