Bridging the DAW and the Soundbox
Integrating Ableton Live’s timeline into Max/MSP and the implications for music composition
A master’s thesis in Elektroakustische Musik
at the Hochschule für Musik “Hanns Eisler” Berlin
September 9th, 2023
Sample Text: Introduction
Before computers became the predominant tool for creating electronic music, composers and performers were routinely faced with the limitations of their physical equipment. Audio equipment was expensive; for people without access to professional sound studios, a 4-channel mixer might have been the only tool available. Today, composers and performers in the same position have access to a quasi-infinite expanse of (relatively) inexpensive tools through digital software. Yet while it’s never been easier to get access to the tools to create music, it’s also never been more challenging to decide which kinds of structures to impose onto the creative process.
For someone who usually works in a DAW, the limitations of structure will likely be all too familiar. Most modern DAWs are modeled after the analog mixing consoles of the 20th century, with their skeuomorphic design concept imposing many of the same limitations that the physical hardware used to – especially regarding signal flow.
However, in software like Max/MSP, Pure Data, or SuperCollider – a category of software that I’m going to refer to as a “Soundbox” – signal flow doesn’t have to be linear. Additionally, unlike a DAW, Soundboxes allow integers, floating-point numbers, and strings (messages) to be sent and received by any point in the program, allowing users to trigger sound-actions with custom-built interfaces or envelope followers (to name just a few examples). Soundboxes don’t impose rigid, standardized structures for interfacing with sound; instead, they let the user invent their own.
While Soundboxes offer many advantages over DAWs when it comes to creativity, the very freedom that they provide comes with its own drawback: a total lack of structure. Without any obvious direction of signal flow or a clearly labelled “Master Channel”, composers who only have experience in DAWs might feel a bit like they’re suddenly floating in the void of outer space. And unlike DAWs, which visualize time as a spatial dimension – usually from left to right, like sheet music (occasionally from top to bottom) – Soundboxes leave their users to experience time in ‘real time’. Without this fundamental temporal overview, even intermediate and advanced Soundbox users may struggle to organize their ideas and create pieces of music with a strong sense of narrative.
So what if the two paradigms could be combined? What if it were possible to take the most advantageous structures from a DAW and use them in combination with the openness and nonlinearity of a Soundbox? In this paper, I will discuss one possible method of creating a hybrid workflow between Ableton Live (DAW) and Max (Soundbox), a technique that emerged out of necessity as I worked on a recent commission. I would like to share the results of my research and experimentation in the hope that they may offer technical and conceptual solutions to problems faced by other electronic music composers.