A friend and I are looking into building an external controller for Sonic Pi for live performance: something tactile that doesn’t involve typing text. My initial idea is to have buttons and encoders modify specific parts of a buffer template, with each control element writing its associated text at a designated line and column in the buffer when pushed or incremented. A separate button push would trigger the reload.
For example, say a template has a `live_loop` statement at line 10, an `end` at line 20, and a `sleep 1` at line 19. Pressing a particular button could insert a `my_function_that_contains_a_sleep()` call at line 18 and comment out the sleep at line 19. You make other changes, then hit the reload button when you’re ready.
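To make the patch-then-reload cycle concrete, here’s a minimal Python sketch of what the box’s firmware might do. Everything in it is hypothetical: the line positions and `my_function_that_contains_a_sleep` are just the placeholders from the example above, and actually delivering the text to Sonic Pi (via OSC, a command-line client, etc.) is left open.

```python
def make_template():
    """A 20-line buffer template; most lines are left blank here."""
    lines = [""] * 20
    lines[9]  = "live_loop :main do"  # line 10
    lines[18] = "  sleep 1"           # line 19
    lines[19] = "end"                 # line 20
    return lines

def press_button(lines):
    """Patch the buffer the way the example describes: insert a call
    at line 18 and comment out the sleep at line 19."""
    patched = list(lines)
    patched[17] = "  my_function_that_contains_a_sleep()"  # line 18
    patched[18] = "  # " + patched[18].strip()             # line 19
    return patched

def reload_buffer(lines):
    """Join the lines into the code string handed to Sonic Pi on reload.
    The transport (OSC, a CLI wrapper, ...) is whatever you pick."""
    return "\n".join(lines)
```

The key property is that each button owns fixed line/column slots, so the buffer state is always a pure function of the control-element states, which is what makes the lights and encoder readouts a faithful display.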
This approach would make developing and debugging easy because, after an initial template setup, the state of the control elements (button on/off lights, encoder readouts of, say, -100 to 100) would represent exactly the buffer’s state. The goal would be to stuff a Raspberry Pi inside a box that has the control elements, hit start, which loads the template and runs it in the buffer, and voila! you’ve got a Sonic Pi MIDI controller.
My friend is the hardware person, not me, but does this sound feasible? I realize that operating Sonic Pi strictly from a limited number of control elements would limit how much of its functionality you could use, but with clever library functions written for different buttons, I think you could do a lot to sequence external MIDI gear in a quick and reliable way that would be very musically useful.