Live Sampling While Live Coding

By Matthew B. Pohl

The concept of live coding is foreign to many people, making it a difficult draw for even small-scale performances outside of a very niche community. Except for the curious musician or coding enthusiast, a live coding environment does not have the same appeal as a rock group, string ensemble, or jazz band would in the confines of the music community. With this knowledge in mind, I proposed the following idea to David Hume and Martin Suarez-Tamayo, two of the Fall 2018 Ensemble’s members: investigate the integration of live musical performance and a live instrument within a live coding environment, namely the TidalCycles live music coding language and a compatible IDE in Atom. In this way, there could be a functional bridge between the audience interpretation of traditional gestural means and the commonly sedentary presentation of live coding.

While it is entirely possible to have any acoustic instrument perform alongside the sound processes executed via Atom (or David Ogborn’s extramuros software when in a group setting), the goal was to integrate a live performance into the code and manipulate it live. The TidalCycles programming language relies heavily on the SuperDirt synth, which is activated through the SuperCollider IDE, and on SuperDirt’s pre-programmed samples. We discovered that these samples are located in a user-accessible folder called “Dirt-Samples,” the contents of which can be modified freely without causing errors. Therefore, one can effectively sample a live instrument into any DAW, export the bounced file into a user-created folder within “Dirt-Samples,” and call upon the sample in Atom. This is the process we followed.
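As a sketch of that workflow (the folder name “livepiano” and the file path are hypothetical), a bounced file placed in a new subfolder of “Dirt-Samples” becomes addressable in Tidal by that folder’s name once SuperDirt has loaded or reloaded its sample library:

```
-- In SuperCollider, SuperDirt reads sample folders at startup; a hypothetical
-- reload after adding a new folder might look like:
--   ~dirt.loadSoundFiles("~/Dirt-Samples/*");

-- In Atom, Tidal can then reference the new folder by name:
d1 $ sound "livepiano"

-- or pick a specific file within the folder by index:
d1 $ sound "livepiano:1"
```

In practice the new sample is treated no differently from the stock SuperDirt sounds, which is what makes the live-sampling approach transparent to the rest of the code.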

Any instrument can be recorded and sampled, whether mic’d in the case of an acoustic instrument such as a violin or saxophone, or recorded directly from a digital instrument such as a keyboard or electronic drum kit. To avoid dealing mid-performance with the problems acoustic instruments face in a live sound environment (levels, compression, and EQ, to name a few), and due to potential space constraints, we opted to use a digital piano (Roland FA-08) as the performance instrument. The output was sent from the keyboard to a MOTU audio interface, where the sound from the piano was mixed with the sounds produced by the USB-connected computer and sent to the main mixing board for distribution out to the eight loudspeakers.

The actual performance, without going into significant detail, consisted of the first two measures of Erik Satie’s 1ère Gymnopédie, which we sampled into Ableton Live at 72 bpm to coincide with the metrical value in Atom, cps (72/60/6). As the sample was being bounced, exported, and relocated to a user-created folder in our local copy of “Dirt-Samples,” I continued to play through the piece by improvising on a central theme of the work. Because the audience had a (partial) view of a performer improvising on a relatively well-known piece of music while the sample was being created, the performance sidestepped the awkward opening moments of silence that live coding often entails and gave the audience a motive to follow throughout.
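The tempo alignment above can be sketched in Tidal (the sample name “satie” is hypothetical; newer Tidal versions use `setcps` where older ones use `cps`). At 72 bpm with six beats per cycle, two 3/4 bars of the Gymnopédie fit exactly one Tidal cycle:

```
-- 72 bpm divided into six beats per cycle: 72/60/6 = 0.2 cycles per second
setcps (72/60/6)

-- loop the two-bar sample once per cycle so it stays locked to the pulse
d1 $ sound "satie"
```

With the cycle length matched to the sample length, further transformations (reversal, chopping, layering) remain in time with the improvised piano part.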

This is a vital distinction between how live coding itself is perceived and what it can become as part of a collaborative environment. I feel that, as live coding matures into a distinct musical art form rather than the more-or-less novelty it presently is, it should be the responsibility of the performer and the orchestrator to find ways that live coding can be intertwined with common musical practice. While this is not a new idea, perhaps it is one step toward performing coders creating saw-wave melodies live for an eighties tribute band, or live drum-loop patterns for a modern pop-rock group, coming soon to a bar or club near you.