The percussive, bell-like quality of this piece (beautifully coded by Dan) was ideal for driving a visualiser, so I decided to give it a go. First I had to type in the code from the tweeted video. I then transcribed it into MuseScore 3, as the rhythms were quite complex, involving time signature changes and incorporating one triplet.
From the score I picked out the notes at the start of each bar (bl), together with the notes at changes in the overall melody line (mel). Finally I picked out the prominent notes in the bass accompaniment (bass). These three lists then generated three hashes, which were interrogated by an incrementing "tick". When a positive match occurred, a midi_cc signal was sent from Sonic Pi to the visor.live app, and this was used to switch various graphical elements on and off. I arranged for the piece to repeat a set number of times: in the video it is three. Then, because the tune never resolves to a finish, I added a final chord to end it nicely.
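The tick-driven lookup described above can be sketched in plain Ruby. This is only an illustration: the tick values are made up, and Sonic Pi specifics (`live_loop`, `tick`, `midi_cc`) are replaced with ordinary Ruby so the sketch runs anywhere, with a stand-in lambda recording the control-change messages that would go to visor.live.

```ruby
# Tick indices at which each musical feature occurs (illustrative values only).
bl   = [0, 8, 16, 24]   # first note of each bar
mel  = [0, 3, 10, 19]   # changes in the overall melody line
bass = [4, 12, 20]      # prominent bass notes

# Turn each list into a hash so every tick can be checked with a fast lookup.
bl_h, mel_h, bass_h = [bl, mel, bass].map { |list| list.to_h { |t| [t, true] } }

# Stand-in for Sonic Pi's midi_cc: record what would be sent to visor.live.
sent = []
send_cc = ->(cc, val) { sent << [cc, val] }

# An incrementing tick interrogates the three hashes; a positive match
# sends a control-change message to toggle a graphical element.
(0..24).each do |tick|
  send_cc.call(1, 127) if bl_h[tick]    # CC 1: bar-start element
  send_cc.call(2, 127) if mel_h[tick]   # CC 2: melody element
  send_cc.call(3, 127) if bass_h[tick]  # CC 3: bass element
end
```

In real Sonic Pi code the loop body would live inside a `live_loop` with a `sleep` matching the tune's pulse, and `midi_cc` would address the MIDI port that visor.live listens on.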
On the visor.live side I used version 0.4.0 and worked with a polygon shape: one large polygon in the centre, and a 10x10 array of smaller ones to fill the screen. I added facilities to alter the colours and to rotate them independently.
It was an interesting project to do, involving a whole lot of problems to solve, but I think the end result is quite nice.
Here is a video of it in action:
I have posted all the code in a gist if anyone wants to try it out. But be warned: it needs quite a high-end machine, as it is very resource-intensive.