MIDI input mapping

(Reposting from https://www.patreon.com/posts/interactive-live-19425885 since responses seem easier to gather and manage here on the forum.)

I’m also really interested in the links between MIDI and Sonic Pi: toggling loops with something like the Novation Launchpad, tuning loop speeds with a nanoKONTROL, etc.
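For what it’s worth, Sonic Pi 3.x can already do a rough version of this by `sync`ing on incoming MIDI cues. This is only a sketch: the exact cue paths depend on your device (check the cue log to see what your controller actually sends), and the pad note number here is an assumption.

```ruby
# Watch a knob (any control_change) and stash its value:
live_loop :knob_watch do
  use_real_time
  cc_num, value = sync "/midi/*/control_change"  # wildcard matches any device
  set :cutoff_val, value                         # 0-127 from the knob
end

# Toggle a flag from a pad hit (note 36 is an assumption -- check your device):
live_loop :pad_toggle do
  use_real_time
  note, vel = sync "/midi/*/note_on"
  set :drums_on, !get[:drums_on] if note == 36
end

live_loop :drums do
  sample :loop_amen, cutoff: (get[:cutoff_val] || 100) if get[:drums_on]
  sleep 2
end
```

It works, but as you say, wiring this up by hand for every parameter is a lot of boilerplate.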

It seems like mapping could be a huge undertaking otherwise. It would be great to have a way to conceptually link a timing/amplitude/cutoff value to a controller just by hitting some sort of “learn” function in the editor and turning a knob (akin to how Reason handles assignment).
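The “learn” idea itself is pretty simple at its core. Here’s a hypothetical sketch in plain Ruby, independent of any real MIDI library (all names here are made up for illustration): arm learn mode for a parameter, and the next CC message that arrives gets bound to it; later messages on that CC just update the value.

```ruby
# Hypothetical "MIDI learn" mapper -- not part of Sonic Pi, just a sketch.
class MidiLearnMapper
  def initialize
    @bindings = {}   # cc_number => parameter name
    @values   = {}   # parameter name => last value, scaled to 0.0-1.0
    @learning = nil  # parameter currently waiting for a knob twist
  end

  # Arm learn mode, e.g. mapper.learn(:cutoff)
  def learn(param)
    @learning = param
  end

  # Feed every incoming control-change message through here.
  def handle_cc(cc_number, value)
    if @learning
      @bindings[cc_number] = @learning  # bind the first knob touched
      @learning = nil
    end
    param = @bindings[cc_number]
    @values[param] = value / 127.0 if param
  end

  def value(param)
    @values[param]
  end
end

mapper = MidiLearnMapper.new
mapper.learn(:cutoff)      # hit "learn" in the editor...
mapper.handle_cc(21, 127)  # ...then turn a knob: CC 21 is now :cutoff
mapper.value(:cutoff)      # => 1.0
```

Something like this sitting between the MIDI event stream and a `set`/`get` time-state key would get most of the way to Reason-style assignment.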

The “Microcontrollers and sensor input with Sonic Pi” post was a great starting point for me, at least.

I’d love to be able to achieve a modular-synth-type flow (e.g., https://www.youtube.com/watch?v=2_ZKPabkd2I) where a variety of tunable parameters are available without a lot of code overhead.

Am I missing something, or is the MIDI system still at too early a stage for this sort of thing?