Live Sampling While Live Coding

By Matthew B. Pohl

The concept of live coding is foreign to many people, making it a difficult draw even for small-scale performances outside of a very niche community. Except for the curious musician or coding enthusiast, a live coding environment does not hold the same appeal as a rock group, string ensemble, or jazz band within the music community. With this in mind, I proposed the following idea to David Hume and Martin Suarez-Tamayo, two members of the Fall 2018 Ensemble: integrate a live instrumental performance into a live coding environment, namely the TidalCycles live coding language running in a compatible IDE, Atom. In this way, there could be a functional bridge between the gestural performance traditions an audience expects and the commonly sedentary presentation of live coding.

While it is entirely possible to have an acoustic instrument perform alongside the sound processes executed via Atom (or David Ogborn’s extramuros software when in a group setting), the goal was to integrate a live performance into the code itself and manipulate it live. The TidalCycles language relies heavily on the SuperDirt synth, which is activated through the SuperCollider IDE, and on SuperDirt’s pre-programmed samples. We discovered that these samples are located in a user-accessible folder called “Dirt-Samples,” whose contents can be modified freely without causing errors. One can therefore sample a live instrument into any DAW, export the bounced file into a user-created folder within “Dirt-Samples,” and call upon the sample in Atom. This is the process we followed.
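The relocation step above can be sketched in a few lines. This is a hedged illustration rather than our exact commands: the folder name “satie,” the file names, and the paths are stand-ins (a temporary directory substitutes here for the real SuperDirt location). Once such a folder exists, the sample can be called in Tidal with a pattern such as d1 $ sound "satie".

```python
import shutil
import tempfile
from pathlib import Path

# A temp dir stands in for the real Dirt-Samples location on disk.
root = Path(tempfile.mkdtemp())
dirt_samples = root / "Dirt-Samples"
custom = dirt_samples / "satie"          # hypothetical user-created folder
custom.mkdir(parents=True)

# Pretend "bounce.wav" is the file exported from the DAW; the bytes
# below are a placeholder, not real audio data.
bounce = root / "bounce.wav"
bounce.write_bytes(b"RIFF....WAVE")

# Relocate the bounced file into the user folder so SuperDirt can load it.
shutil.copy(bounce, custom / "satie0.wav")
print((custom / "satie0.wav").exists())  # True
```

SuperDirt indexes each subfolder of “Dirt-Samples” by name, which is why dropping a new folder in place is enough for Tidal to address it.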

Any instrument can be recorded and sampled, whether captured acoustically with a microphone in the case of a violin or saxophone, or recorded directly from a digital instrument such as a keyboard or electronic drum kit. To avoid the problems acoustic instruments face in a live sound environment (levels, compression, and EQ, to name a few) during an ongoing performance, and due to potential space constraints, we opted to use a digital piano (Roland FA-08) as the performance instrument. The output was sent from the keyboard to a MOTU audio interface, where the sound from the piano was mixed with the sounds produced by the USB-connected computer and sent to the main mixing board for distribution to the eight loudspeakers.

The actual performance, without going into significant detail, comprised the first two measures of Erik Satie’s 1ère Gymnopédie, which we sampled into Ableton Live at 72 bpm, matching the tempo value set in Atom, cps (72/60/6). As the sample was being bounced, exported, and relocated to a user folder in our local copy of “Dirt-Samples,” I continued to play through the piece by improvising on a central theme of the work. Because the audience had a (partial) view of a performer improvising on a relatively well-known piece of music while the sample was being created, the improvisation bridged the awkward opening silence live coding often entails and gave the audience a motive to follow throughout the performance.
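The tempo value above is simply an arithmetic conversion from beats per minute to Tidal’s cycles per second: divide by 60 to get beats per second, then by the number of beats in one cycle. A minimal sketch, assuming six beats per cycle as in the expression cps (72/60/6):

```python
# Convert beats per minute to cycles per second (cps) as TidalCycles
# expects: bpm / 60 gives beats per second; dividing by the number of
# beats in one cycle gives cycles per second.
def bpm_to_cps(bpm: float, beats_per_cycle: int) -> float:
    return bpm / 60 / beats_per_cycle

print(round(bpm_to_cps(72, 6), 6))  # 0.2
```

Keeping the DAW session and the Tidal clock at the same effective tempo is what lets the relocated sample loop in time with the coded patterns.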

This is a vital distinction between what live coding is perceived to be and what it can become as part of a collaborative environment. I feel that, as live coding matures into a distinct musical art form rather than the more-or-less novelty it presently is, it should be the responsibility of the performer and the orchestrator to find ways to intertwine live coding with common musical practice. While this is not a new idea, perhaps it is one step toward performing coders creating saw-wave melodies live for an eighties tribute band, or live drum loop patterns for a modern pop-rock group, coming soon to a bar or club near you.


Using Sampling in Popular Music

by Reyse Jaster

Over the past year, I have been investigating the use of samples in popular music. Samples take various forms in music. In rock music, it is increasingly common to replace a live drum sound with one that was previously recorded; this previously recorded sound, known as a sample, is often considered more consistent and polished than the live take. In hip-hop and many electronic music genres, samples have become a main component of the music itself. This method of creating music is uniquely modern and worth exploring.

The first time that I really came to appreciate sampling was when I heard the album Endtroducing… by Josh Davis, also known as DJ Shadow. This album is credited as the first created entirely from samples. Although it clearly has roots in the hip-hop production Davis grew up with, the resulting music is not easily definable, and one need not be a fan of hip-hop to enjoy it. Much like a sculptor makes use of found objects to create a piece of art, Davis uses found portions of recorded music to create something new and compelling. A short section of piano is removed from its original context and juxtaposed against a funk drum sample. Although none of the individual components was created by Davis, the result is greater than the sum of its parts. Even when I knew the origin of a sample in one of these pieces, I still felt as if Davis had made it his own.

Using modern software and hardware, I have been learning to create my own music using samples. This process has increased my appreciation of this type of music. Modern hip-hop sampling originated in the late 1980s with the advent of new hardware that made the process more efficient. Equipment such as the Akai MPC gave hip-hop producers the freedom to experiment with samples in the digital realm; they no longer had to physically cut and splice pieces of magnetic tape. My equipment of choice, the Native Instruments Maschine, could be considered a modernized version of an Akai MPC. The touch-sensitive pads and button layout are similar to the MPC’s, but instead of being a standalone piece of equipment, the Maschine functions as a tactile software controller.

Although one could certainly create a sample-based piece of music exclusively using software with a computer keyboard and mouse, Maschine provides a more musical approach. The user is able to ‘play’ a sample with the touch-sensitive pads, similar to playing a note or chord on a traditional acoustic instrument, and can assign any sound they wish to a pad.
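At its core, that assignment is a mapping from pad numbers to sample files. The following is a hypothetical sketch of the idea, not Maschine’s actual API: the pad count, sample names, and the trigger function are all illustrative stand-ins.

```python
# Hypothetical pad bank: each touch-sensitive pad maps to a sample file,
# and striking a pad looks up which sound to play.
pads = {
    1: "kick.wav",
    2: "snare.wav",
    3: "piano_chop.wav",   # any sound can be assigned to any pad
}

def trigger(pad_number: int, velocity: int = 100):
    """Stand-in for audio playback: report which sample a pad fires."""
    sample = pads.get(pad_number)
    if sample is None:
        return None        # unassigned pad stays silent
    return (sample, velocity)

print(trigger(3))  # ('piano_chop.wav', 100)
```

The velocity parameter stands in for the pads’ touch sensitivity: in a real controller, how hard the pad is struck shapes the loudness of the triggered sample.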

When creating a painting, the available colours are essentially endless. The same is true of sample-based music creation. To use an existing portion of music in a new work, you must first be able to imagine how that portion will stand alone, which can be a daunting task in itself. You must then determine how multiple portions of unrelated music will work together. At this stage, an individual’s creativity can truly flourish.

Current technology allows samples to be incorporated easily into music. The onus now lies on the musician to expand the possibilities of this type of music. I hope to foster this potential and look forward to the new music it will bring.