Processing experiment

Hi, I'm continuing my Processing experimentation.

The code:

Enjoy!


This is great.

Glancing at your code, it seems that everything is pre-planned and then executed when you run the program. Is that correct, or are you making changes live in your Sonic Pi code to change the visuals?

I have experimented with using an OSC library for p5.js with Sonic Pi to have visuals react to changes in the code, so there can be some live-coding element to it. Unfortunately, all of those changes must be pre-planned in the p5.js code, because you cannot make changes there once the program is running. The only changes come from the values sent from Sonic Pi and how those are set up to affect the visuals in p5. I'm interested to know how other people deal with this limitation when using p5/Processing.
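For anyone curious what actually travels between the programs in this setup: an OSC message is just a null-padded address string, a type-tag string, and big-endian arguments over UDP. Here is a minimal sketch in plain Ruby (the `/viz/amp` address and port 12000 are made-up examples; in practice Sonic Pi's own `osc` command and the p5.js OSC libraries do this encoding for you):

```ruby
require 'socket'

# Pad an OSC string: null-terminated, total length a multiple of 4 bytes.
def osc_pad(s)
  s + "\x00" * (4 - s.bytesize % 4)
end

# Encode an OSC message with float32 arguments (big-endian, per the OSC 1.0 spec).
def osc_message(address, *floats)
  type_tags = ',' + 'f' * floats.length
  osc_pad(address) + osc_pad(type_tags) + floats.map { |f| [f].pack('g') }.join
end

# Send an amplitude value to a sketch listening on UDP port 12000
# (hypothetical address and port -- match them to your p5/Processing listener).
msg = osc_message('/viz/amp', 0.5)
UDPSocket.new.send(msg, 0, 'localhost', 12000)
```

The p5 side then only has to map whatever floats arrive on that address onto drawing parameters, which is why the mapping itself has to be written before the sketch starts.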

Yes, I start the program and everything runs to the end.
But in my first video

it's keyPressed in Processing that mutes and unmutes the Sonic Pi instrument and
decides whether or not to send OSC messages to p5.
There is a mode to code Processing on the fly:
otherwise, if you don't plan your changes in advance, it's not possible to
modify Processing while it is running.
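The keyPressed routing described above could look roughly like this on the Sonic Pi side (a sketch only, assuming Sonic Pi 3+'s incoming OSC cues; the `/mute` path and `:amp` key are invented for illustration):

```ruby
# Hypothetical Sonic Pi sketch: keyPressed in Processing sends /mute 1 or 0,
# which arrives as an OSC cue and gates the synth's amplitude.
live_loop :listen do
  use_real_time
  muted = sync "/osc*/mute"        # wildcard matches the sender's host:port prefix
  set :amp, (muted[0] == 1 ? 0 : 1)
end

live_loop :pad do
  play :c4, amp: (get[:amp] || 1)
  sleep 0.5
end
```

Because the loops read the shared `:amp` value on every pass, the sound reacts immediately to the key press even though the Processing sketch itself cannot be edited while running.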

Super cool stuff! I’m OK with the Processing not being changeable on the fly, as I prefer to livecode Sonic Pi and have the visuals react to that. Awesome.

Visor is worth looking at as an intermediary between Processing and Sonic Pi. You code in Ruby, and you can alter the sketch while it is running and re-run it, just as you would when live-coding on Raspberry Pi. You can send MIDI from Sonic Pi to affect the code too.

That second video is cool. I find the music interesting as well, nice one.