Thanks so much for this “old code”, @xavierriley! Drawing inspiration from it, I was able to realize a long-term dream: playing chords with a bit of “automated voiceleading”. I now have a Plugdata patch and PdLua objects which implement a version of what you did in Sonic Pi. (My coding skills are really fragmentary. I’ve been using LLMs to get me over multiple humps.)
As much as I love Sonic Pi, I spend much of my musicking life with plugins, so I needed something that worked there. I recently purchased RapidComposer, an expensive app/plugin which is remarkably difficult to use. Still, it does most of the things I want to do and a whole lot more, including sophisticated voiceleading (and voicing).
I was struck by the “Chord Rules” idea. It gave me the nudge I needed to “extract rules” from a corpus of jazz standards. As a result, I have scripts (first in Python/music21, then in NodeJS with Tonal) which generate chord progressions based on the probabilities of chord transitions.
That was already a “game-changer” for me, partly because the results are exactly what I needed. The progressions generated aren’t perfect, of course. And that’s part of the point: I can now tweak the dataset based on my own preferences.
To do so, I much prefer auditioning these transitions in voiceled mode. I don’t know how much I can enjoy certain transitions between chords when they’re in root position. A large part of what I appreciate comes from voiceleading potential, I’m finding out. And I’m a big fan of open voicings.
So… Since I was able to generate chord progressions, I tried to apply some voiceleading to those chords using similar techniques. It didn’t work properly in NodeJS… but it worked right away in Lua, thanks to Claude AI.
That was something of a revelation, last Saturday. After years of trying, I had something I could use (and more or less understood) which took a chord name (including in functional notation through altered Roman numerals, e.g. bIIImaj7) and a reference voicing, and output a voiceled voicing for that new chord.
After a bit of work, I was able to implement that in PdLua, which allows me to run the whole thing as a plugin patch in any DAW (on the desktop; PdLua support on iPadOS is running into issues with paths and such).
So, really, Xavier, your code was what allowed me to overcome an obstacle I had for years (meaning, long before your original post in this thread).
I had heard of Tymoczko and had tried to understand some of his work. And I realize (now) that his arca Python code contains everything I’d need.
Yet I don’t have the skills needed to get much of this.
When I asked LLMs to transcode from Sonic Pi to JS, I ran into issues I couldn’t solve on my own. Something was off and I’m not sure what it was.
I could tell that some of the code was about converting things to pitch classes and then doing some processing between pitch class sets. That made sense to me. The “taxicab norm” also made some sense (it’s just the sum of how far each voice moves, in absolute semitones), though I found the explanation… “cumbersome”.
I eventually used the following (verbose) prompt, which led me to a working solution (a rough Lua sketch of the resulting approach follows the list of steps):
I have a Pd-Lua project with several milestones and possible extensions. Pd-Lua is a special version of Lua which integrates with PureData, especially in Plugdata (a new flavour of PureData which can work as a plugin on different platforms). Once I have the Lua code, I should be able to integrate it in a PureData object.
The first milestone is to create a transition between chords by making a transition matrix from a list of four MIDI note numbers (‘voicings[0]’) and an ordered pitch class set (‘chords[1]’).
To break that down into steps to do in Lua:
- I first need to convert each MIDI note number into its pitch class (which will become ‘chords[0]’). Should be easy. The pitch class of a MIDI note number is its value modulo 12.
- Then, I need to calculate the relative distance between two pitch classes, such as the first element of chords[0] and the first element of chords[1]. If it’s going from 0 to 11, for instance, the value would be -1. Going from 7 to 0 is 5.
- After that, I need to create a matrix with all of these relative distances (first element of the first ordered pitch class set with each element of the second, then the second element of the first pitch class set with each element of the second).
- I then sum the absolute values of each possible transition from one ordered pitch class set to the other.
- I can then pick the transition with the lowest sum of distances in absolute values (so, the “smoothest” path between chords[0] and chords[1]).
- If two transitions have the same sum, I can pick at random, for now.
- Once all of this is done, I can add the relative distances to voicings[0] to create voicings[1]. So, if the distance between the first element of chords[0] and chords[1] is -1 and the first element of voicings[0] has a MIDI note of 60, the first element of voicings[1] should be 59.
- Once voicings[1] has been fully created (as an ordered list of four MIDI note numbers), I should send the full ordered list to the first outlet and each MIDI note number to a separate outlet.
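For anyone curious, here’s a minimal Lua sketch of those steps. This is not my actual PdLua object: the names are mine, it assumes both chords have the same number of notes, and it only tries rotations of the target pitch class set as candidate transitions (rather than the full matrix of pairings described in the prompt). Still, it shows the shape of the approach:

```lua
-- Pitch class of a MIDI note number.
local function pitch_class(midi_note)
  return midi_note % 12
end

-- Signed distance (in semitones) from pitch class a to pitch class b,
-- wrapped into the range -6..+5 (so 0 -> 11 gives -1, and 7 -> 0 gives +5).
local function signed_distance(a, b)
  return ((b - a + 6) % 12) - 6
end

-- Given a reference voicing (MIDI note numbers) and a target ordered pitch
-- class set, return a "smoothest" voicing of the target chord.
local function voicelead(voicing, target_pcs)
  local source_pcs = {}
  for i, note in ipairs(voicing) do
    source_pcs[i] = pitch_class(note)
  end

  -- Try each rotation of the target pitch class set as a candidate mapping
  -- and keep the one with the lowest taxicab cost (sum of absolute moves).
  -- Ties keep the first candidate found (the prompt picks at random instead).
  local best_moves, best_cost = nil, math.huge
  for r = 0, #target_pcs - 1 do
    local moves, cost = {}, 0
    for i, pc in ipairs(source_pcs) do
      local target = target_pcs[((i - 1 + r) % #target_pcs) + 1]
      local d = signed_distance(pc, target)
      moves[i] = d
      cost = cost + math.abs(d)
    end
    if cost < best_cost then
      best_cost, best_moves = cost, moves
    end
  end

  -- Apply the chosen per-voice moves to the reference voicing.
  local new_voicing = {}
  for i, note in ipairs(voicing) do
    new_voicing[i] = note + best_moves[i]
  end
  return new_voicing
end

-- Example: a Cmaj7 voicing (C E G B) moving to Fmaj7 (pitch classes 5 9 0 4)
-- yields {60, 64, 65, 69}, i.e. C E F A.
local next_voicing = voicelead({60, 64, 67, 71}, {5, 9, 0, 4})
print(table.concat(next_voicing, " "))
```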
There was some back-and-forth involved, obviously. Still, I quickly ended up with a working script in (command-line) Lua. In fact, it accounted for chords of different sizes by doubling some of the chord tones, which is something I wanted to implement later.
When it prompted me with further steps, I asked about voiceleading a whole progression in altered Roman numerals… which worked right away with some common chord qualities. I then converted the chord dictionary I had from TonalJS (in intervals, like 1P 3M 5P 7M) into ordered pitch class sets (0 4 7 11), and integrated the whole thing into two PdLua objects (one to convert chord names into ordered pitch class sets, the other to do the voiceleading using an incoming voicing in MIDI note numbers as a reference).
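To give an idea of what that conversion looks like, here’s a small hand-made excerpt in Lua. The interval table and the chord entries are mine, not the full TonalJS dictionary:

```lua
-- A tiny interval-name-to-semitone table (only the intervals used below;
-- the real TonalJS dictionary covers many more).
local interval_semitones = {
  ["1P"] = 0, ["3m"] = 3, ["3M"] = 4, ["5P"] = 7,
  ["7m"] = 10, ["7M"] = 11,
}

-- Convert a space-separated interval string ("1P 3M 5P 7M")
-- into an ordered pitch class set ({0, 4, 7, 11}).
local function intervals_to_pcs(intervals)
  local pcs = {}
  for name in intervals:gmatch("%S+") do
    pcs[#pcs + 1] = interval_semitones[name] % 12
  end
  return pcs
end

-- A few chord qualities, keyed by suffix.
local chord_pcs = {
  ["maj7"] = intervals_to_pcs("1P 3M 5P 7M"),  -- {0, 4, 7, 11}
  ["m7"]   = intervals_to_pcs("1P 3m 5P 7m"),  -- {0, 3, 7, 10}
  ["7"]    = intervals_to_pcs("1P 3M 5P 7m"),  -- {0, 4, 7, 10}
}
```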
Along the way, I’ve lost a couple of things (that I can retrieve, fairly easily). For instance, the current version isn’t as effective in dealing with differently-sized chords as one of the earliest ones. Shouldn’t be hard to integrate the old code into the new version.
Still… I have something that I can actually use. And many ideas for improvements.
One of the main things will be to convert my “prog gen” code to work in Plugdata. Shouldn’t be exceedingly hard once I figure out a format for transition counts that (Pd)Lua (or Plugdata itself) can process. Since LLMs typically don’t have the kind of data needed to work in a patching language, it might be easier to do in Lua, for now.
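One format I have in mind is simply nested Lua tables of counts, with a weighted random pick for the next chord. The chord symbols and numbers below are made up, purely to show the shape:

```lua
math.randomseed(os.time())

-- Hypothetical transition counts (not my actual corpus data), keyed by the
-- current chord; each value says how often a given next chord followed it.
local transition_counts = {
  ["Imaj7"] = { ["IIm7"] = 9,   ["V7"] = 3 },
  ["IIm7"]  = { ["V7"] = 20,    ["Imaj7"] = 2 },
  ["V7"]    = { ["Imaj7"] = 18, ["IIm7"] = 4 },
}

-- Pick the next chord with probability proportional to its count.
local function next_chord(current)
  local counts = transition_counts[current]
  local total = 0
  for _, n in pairs(counts) do total = total + n end
  local pick = math.random(total)  -- integer between 1 and total
  for chord, n in pairs(counts) do
    pick = pick - n
    if pick <= 0 then return chord end
  end
end

-- Walk a short progression starting from Imaj7.
local chord = "Imaj7"
for _ = 1, 8 do
  io.write(chord, " ")
  chord = next_chord(chord)
end
print()
```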
Once I have that, I’ll be able to have a continuously playing chord generator within a plugin patch.
The voiceleading algo is key to how satisfying this will be (and already is).
And that’s thanks to that Sonic Pi gist.
Cool!