Aquatic ambiance with Sonic Pi, Hydra, and external MIDI devices

What's up, fellas!

Thanks to this awesome forum community, I was able to duct-tape together this proof of concept using Sonic Pi to control my SP-404, a Korg Minilogue, and the Hydra visual synth all at once.

"Duct tape" is not an understatement; I struggled to put this code together. The last time I worked with code was about ten years ago, so please excuse the bloat. I learned a lot about Sonic Pi and how MIDI works in the process. I know it's not pretty, but I need to share it first so I can ask some questions of you all, who are far more experienced with Sonic Pi than I am.

Here are some problems I ran into and things I learned putting this all together. I welcome feedback and ideas for optimizing the process.

1- It was annoying that the MIDI port names changed, seemingly at random, every couple of hours, so I had to go back and change the port number on dozens of lines of code. I figured out how to use `use_midi_defaults` and set my Minilogue as the default, because that's the device with the most lines of code. Is there a way to save the values of the different MIDI outputs in variables and simply call on those values, so the lines look more like `midi_note_on [note], device: [value that contains both port number and channel]`? This would be extremely useful for how the SP-404 works with MIDI. (The sketch below shows roughly what I'm picturing.)
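For illustration, here's the kind of thing I mean. The port names are placeholders for whatever your system reports, and I haven't verified that splatting a Ruby hash into the opts like this works in Sonic Pi:

```ruby
# Placeholder port names; substitute whatever your system reports
minilogue = {port: "minilogue_kbd_midi_1", channel: 1}
sp404     = {port: "sp404_mk2_midi_1",     channel: 10}

midi_note_on :e3, **minilogue   # hash-splat passes port: and channel: as opts
sleep 0.5
midi_note_off :e3, **minilogue
midi_note_on :c2, **sp404
```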

2- It takes me far too many lines of code to program these arpeggios. What do you recommend for compacting the code while keeping the sleep values flexible for the different note durations this song requires? (A sketch of one direction I'm considering follows below.)
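One direction I'm considering is pairing a ring of notes with a ring of durations, so each note and its sleep stay in lockstep. A minimal sketch; the port name and the notes themselves are just placeholders:

```ruby
# Placeholder port name; replace with whatever your system reports
use_midi_defaults port: "minilogue_kbd_midi_1", channel: 1

notes = (ring :e3, :g3, :b3, :d4, :b3, :g3)   # example arpeggio
durs  = (ring 0.5, 0.25, 0.25, 1, 0.25, 0.25) # one duration per note

live_loop :arp do
  n = notes.tick
  d = durs.look   # .look reads the same index .tick just advanced to
  midi_note_on n
  sleep d
  midi_note_off n
end
```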

3- Is it possible to push API commands to Hydra from Sonic Pi, or to trigger an external local script from Sonic Pi, so that I can change the Hydra code from outside Hydra? That would be awesome for eventually transitioning from one scene to another without having to copy-paste code into Hydra. I know I may be asking for too much at this point. Are there any other visual synths that work well with Sonic Pi? I was also thinking of integrating OBS into this setup so I can have a dynamically changing background. I'm curious whether anyone else has gone in this direction. (The sketch below shows the Sonic Pi side of what I'm imagining.)
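For the record, here is the Sonic Pi side of what I'm imagining, using its built-in OSC support. Everything Hydra-facing here is hypothetical: Hydra doesn't listen for OSC out of the box, so this assumes some bridge process that receives these messages and swaps the code running in the Hydra page, and the `/hydra/scene` address and port 3333 are made up:

```ruby
# Send OSC to a hypothetical bridge process, not to Hydra directly
use_osc "localhost", 3333

live_loop :scenes do
  osc "/hydra/scene", 0   # bridge would load scene 0's Hydra code
  sleep 32                # 32 beats per scene, just as an example
  osc "/hydra/scene", 1
  sleep 32
end
```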

Sorry if this feels more like a support thread than anything else; feel free to move it there if so. I just wanted to highlight and prove that you can use Sonic Pi as a tool to compose both music and visuals at the same time, using external devices. I'll keep trying to optimize it all, and hopefully I'll have more to share with you in the future :partying_face:

Provided that you are using different channels to separate your devices, you can always use the wildcard * for the port name, in which case the MIDI messages are sent to all connected ports.
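For instance (the channel assignments here are arbitrary):

```ruby
# "*" broadcasts on every connected MIDI port; each device
# only responds to messages on its own channel
midi_note_on :e3, port: "*", channel: 1   # e.g. the Minilogue on channel 1
midi_note_on :c2, port: "*", channel: 10  # e.g. the SP-404 on channel 10
```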
EDIT
By the way, I see you are using pairs of midi_note_on and midi_note_off commands. It's much easier to use `midi <note>, sustain: <time>`, as this automatically creates the note-on/note-off pair separated by the sustain value.
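So the wildcard example above collapses to, say:

```ruby
# One call per note: the note-off is sent automatically
# after the sustain time elapses
midi :e3, sustain: 0.5, port: "*", channel: 1
```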
