AWS DeepComposer – 1st Impressions from a Music Producer

Author: Rex Reyes
Last modified: February 24, 2021

Background

Artificial Intelligence and Machine Learning have been part of music creation for quite a while (doing things like correcting a singer’s pitch), but over the past few years I have seen apps, plugins and features that use machine learning to actually compose music.  I tried quite a few of them when they first came out, but the results felt like monkeys at typewriters trying to bang out a Shakespearean play.  As such, none of them ever made it into my studio as a permanent everyday tool.

As an AWS Select Consulting Partner, Cloud Brigade has completed several AWS Machine Learning projects.  So, when AWS sent our CEO, Chris Miller, an AWS DeepComposer keyboard, he offered it to me, knowing my passion for digital music production. While I understand the keyboard is intended as an introductory tool for developers getting into machine learning and music, I figured my blend of software development and music creation experience might be just what it takes to make good use of this solution.

Challenge

Can AWS machine learning software compose music? If so, how do we incorporate the AWS DeepComposer tool into a music studio with a mix of hardware, software, and three different DAWs (Digital Audio Workstations)?

Benefits

  • Easy-to-use music composition tool
  • Automation and predictive analysis
  • Integration with Ableton Live, SoundCloud, and MuseScore

Business Challenges

  • Irresolvable Complexity: Successfully integrated a machine learning tool with music production software
  • Inefficient Systems and Processes: AWS DeepComposer gives music producers the opportunity to automate parts of the process while keeping their creative integrity

Solution and Strategy

So the first thing I decided to do was figure out the workflow. How could I incorporate the AWS tool, as developer-centric as it is, into a music studio?  My home music studio is pretty complicated.  I have a mix of hardware and software, and I actually use three different DAWs.  When I am composing, however, I usually stick to Ableton Live (a digital audio workstation and instrument for live performance, as well as a tool for composing, recording, arranging, mixing, and mastering), so I needed to understand whether I could move MIDI (Musical Instrument Digital Interface) data between Ableton Live and AWS DeepComposer.  I started up Live and loaded a song I was working on.  Knowing AWS DeepComposer works its magic on a melody you give it, I exported a MIDI clip of the bassline, which you can listen to here:

“AWS’ machine learning devices have really opened the door for our staff to explore the seemingly infinite opportunities to apply AI, and to produce new solutions to problems we weren’t even thinking about.”

-Chris Miller, Founder and CEO

Technical Hurdles to Overcome

Then I opened up the AWS DeepComposer Music Studio and, under the “Choose Input Track” dropdown, selected “Import a Track.” AWS DeepComposer returned some errors, so it was time to troubleshoot.  Even though AWS says the MIDI file should be “8 bars or less,” it complained when I tried to import a 4-bar file, so I re-exported an 8-bar loop.  Then I got a message about missing BPM (Beats per Minute) data. After looking into Ableton’s MIDI export function, I found users reporting that it does not embed BPM data.  What I needed at this point was a simple app to add the BPM info.  After trying out a few, the one that worked best was MuseScore, a free notation program.  All I had to do was open the MIDI file and re-export it as .mid.
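If you’d rather script that fix than round-trip through MuseScore, here is a minimal sketch using Python and the mido library (my choice of tool, not anything AWS or Ableton provides); the filenames and BPM value are hypothetical and should be swapped for your own:

```python
import mido

SRC = "bassline_8bar.mid"        # hypothetical: the clip exported from Ableton Live
DST = "bassline_8bar_tempo.mid"  # hypothetical output name
BPM = 120                        # set this to your Live set's tempo

mid = mido.MidiFile(SRC)

# Report the clip length so you can confirm it fits DeepComposer's 8-bar limit
# (this assumes 4/4 time).
ticks_per_bar = mid.ticks_per_beat * 4
total_ticks = max(sum(msg.time for msg in track) for track in mid.tracks)
print(f"Clip length: {total_ticks / ticks_per_bar:.1f} bars")

# Ableton's export reportedly omits tempo, so prepend a set_tempo meta
# message to the first track if one isn't already there.
has_tempo = any(msg.type == "set_tempo" for track in mid.tracks for msg in track)
if not has_tempo:
    mid.tracks[0].insert(0, mido.MetaMessage("set_tempo", tempo=mido.bpm2tempo(BPM)))

mid.save(DST)
```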

So now I had my own “melody” in AWS DeepComposer, and it was time to see what it could do.  I hit “Continue” but got an “Input track required” message.  It turns out you need to hit “Edit Melody” and then “Apply Changes” to get the MIDI file to register.

To continue, you’ll choose one of these three Machine Learning techniques: 

  1. Adding or subtracting notes from your melody
  2. Generating accompaniment tracks
  3. Extending your track by adding notes

Wanting to hear what it would do out of the box, I went with the second option, GAN (Generative Adversarial Network), as it had five pre-built models.  I picked rock and hit “Continue.”  This was the result:

This Was No Ordinary Project

What you get is a five-track composition (guitar, bass, pad, drums, and your original melody) played back through your computer’s built-in synthesizer.  Much like with all the other machine learning music generators I have tried, the initial result is pretty chaotic and not really usable as is.  Unfortunately, your only output/export option is to upload an audio file to SoundCloud.  This is an AWS limitation, but my goal was to use this as an actual music tool and not just a cute demo.  After looking around quite a bit, I discovered how to get the MIDI into my DAW by using the “classic music studio.”

In classic mode, there is a “Download Composition” button at the top that downloads a multitrack MIDI file of the loop AWS DeepComposer just created.  I had to rename the file extension to .mid (instead of .midi), but then it was ready to drop into Ableton Live’s grid view. In this case, I thought the synthesizer pad track had potential, so I chose a nice dreamy sound from Massive X and tweaked it a tiny bit:
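For anyone who wants to script that cleanup, here is a minimal sketch (again using Python and the mido library, my own choice; the download filename is hypothetical) that renames the extension and splits the multitrack file into one clip per instrument for easier dragging into Live:

```python
import mido
from pathlib import Path

src = Path("deepcomposer-composition.midi")  # hypothetical download name

# Rename .midi to .mid so the DAW recognizes the file.
mid_path = src.with_suffix(".mid")
src.rename(mid_path)

# Split the multitrack file into single-track files, one per instrument.
multi = mido.MidiFile(str(mid_path))
for i, track in enumerate(multi.tracks):
    single = mido.MidiFile(ticks_per_beat=multi.ticks_per_beat)
    single.tracks.append(track)
    out = mid_path.with_name(f"{mid_path.stem}-track{i}.mid")
    single.save(str(out))
    print(f"Wrote {out} ({track.name or 'unnamed'})")
```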

Results

I added it to my original bassline along with some drums, and we have a nice little loop:

“I’m passionate about digital music production, and was super excited for the opportunity to blend in my software development expertise to create the perfect little tune.”

-Rex Reyes, Senior Developer

Future Opportunities

So in the end, I did get a nice little chord progression, and one I will probably use in the final version.  Considering this is a tool not really intended for a music studio, it was pretty easy to get something usable out of it.  As it stands, I will use it sporadically, but if I can figure out a way to route MIDI in and out of the app in real time (my next goal), it becomes a lot more useful.  I also plan on working with our ML guru, Matt Paterson, to see if we can’t create some of our own models.  I would love compositions that require less cleanup and are geared more to my genre of music. I am not sure about Amazon’s plans for DeepComposer, but I don’t think it would take too much work to integrate it with professional music apps, and I would be all in for that.
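For the curious, the real-time routing I have in mind would look something like the sketch below, again with Python and mido. The port names are hypothetical and depend on your setup, and as far as I can tell DeepComposer does not currently expose a MIDI port, so this is strictly an aspiration:

```python
import mido

# See what MIDI ports your system actually exposes before wiring anything up.
print("Inputs: ", mido.get_input_names())
print("Outputs:", mido.get_output_names())

# Hypothetical port names; with the python-rtmidi backend on macOS/Linux
# you can also create a virtual port via mido.open_output(name, virtual=True).
with mido.open_input("Ableton Live Out") as inport, \
     mido.open_output("DeepComposer Bridge") as outport:
    for msg in inport:       # blocks, yielding messages as they arrive
        outport.send(msg)    # pass everything straight through
```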

Download the full story here.

What’s Next

If you like what you read here, the Cloud Brigade team offers expert Machine Learning and Software Development services to help your organization gain new insights. We look forward to hearing from you.

Please reach out to us using our Contact Form with any questions.

If you would like to follow our work, please sign up for our newsletter.

About Rex Reyes

With website development as the core focus of his 20-year career, Rex has evolved his skills, continuously learning and adapting with the times. As the “dotcom” days exploded, Rex jumped into animation and Flash multimedia interactive website design for international retail enterprises, and became a jack-of-all-digital-trades for a leading music and lifestyle magazine, resulting in his fluency in PHP, JavaScript, CSS and HTML. His Bachelor’s and Master of Fine Arts degrees in Painting lent themselves well to the more creative digital projects Rex has taken on in marketing design and print production. Rex easily balances his logical and creative sides, and thrives on challenges where he has the freedom to push creative limits while still delivering functionality for top clients like Franklin Covey and URB Magazine. When Rex isn’t behind a computer screen, he may be producing music, DJing, painting, or traveling with his family.
