I’m still pretty new to Sonic Pi, but the first thing I needed to figure out was how to record live_loops to tracks in Ableton Live. My goal was to sync BPM and start recording with tracks armed. I couldn’t find this anywhere else, so I thought I’d share what works so far on a Mac (I’m not sure about Windows, but I believe it works the same):
UPDATE: Judging by a comment below, if you aren’t on a Mac you may need to install Python for this to work. https://www.python.org/
Install LiveOSC into the Live app on your system:
This receives on port 9000 and sends on 9001 (which is hard to find).
NOTE: You need to receive on port 4560. The LiveOSC command for that is:
osc "/remix/set_peer", "", 4560
A list of the available LiveOSC commands is here:
The only thing missing so far is a global record trigger, so I just mapped Ableton’s record button to a midi_cc.
My test session had 4 live_loops, and I also wanted to record MIDI to two tracks. I set up a define for this, like so:
define :osc_setup do
  use_osc "192.168.1.6", 9000      # Live machine's IP; LiveOSC listens on 9000
  osc "/remix/set_peer", "", 4560  # tell LiveOSC to send replies to Sonic Pi's port
  osc "/live/tempo", current_bpm   # sync Live's tempo to Sonic Pi
  osc "/live/arm", 0, 1            # arm tracks 0-5 for recording
  osc "/live/arm", 1, 1
  osc "/live/arm", 2, 1
  osc "/live/arm", 3, 1
  osc "/live/arm", 4, 1
  osc "/live/arm", 5, 1
end

use_bpm 60
osc_setup
midi_cc 1, 1, channel: 9  # fires the CC I mapped to Ableton's record button
To record multiple tracks at once I needed to set up a multichannel audio source using Loopback by Rogue Amoeba (a flexible aggregate audio builder). I used a normal “pass-through” as the source with 10 channels and routed everything except the default 1/2 stereo pair to 4 other output channel pairs, like this.
Choose this as your Mac’s audio input source.
In Ableton you will also need to set this for your audio input:
And choose LiveOSC as a control surface under Link/MIDI:
From inside each live_loop you can then use with_fx :sound_out_stereo, output: [the channel you want to send to]. Remember that with stereo the channels come in pairs, so I use 3, 5, 7 and 9 for my 4 tracks.
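For example, a minimal loop routed to channels 3/4 might look like this (the synth and pattern are just placeholders, not part of my actual session):

```ruby
live_loop :bass do
  with_fx :sound_out_stereo, output: 3 do  # send this loop out channels 3/4
    synth :tb303, note: :e2, release: 0.5
  end
  sleep 1
end
```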
You can also send MIDI out from the loops as needed. I use the IAC Driver on my Mac (which wasn’t enabled by default):
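A sketch of sending notes to one of the armed MIDI tracks (this assumes the IAC bus shows up as a MIDI out port in Sonic Pi and Ableton is listening on channel 1 — adjust to your setup):

```ruby
live_loop :keys do
  midi :e3, sustain: 0.5, channel: 1  # note goes to all connected MIDI ports by default
  sleep 1
end
```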
With this, when you run your code it will start Ableton recording on the tracks it arms. You still need to stop recording in Ableton, but it will start recording from the beginning.
This is pretty rudimentary, but a good starting point.
Oh, and you can also control params on devices and all kinds of other things. This was one of my main reasons for trying to learn Sonic Pi - even though it is incredible on its own.
There is also a Python library (pylive https://github.com/ideoforms/pylive ) that allows Python control of Live — but what it seems to be missing is the runtime engine available in Sonic Pi.
But this allows for some serious generative audio, midi and parameter control via Sonic Pi.
ONE CAVEAT SO FAR:
I thought I’d be able to receive these commands using the Max for Live “Connection Kit”, which contains TouchOSC for learning/mapping OSC sends to parameters. This works much like @robin.newman shared in a previous post ( view it here ). However, for some reason I can’t receive any commands in TouchOSC at this point, even when it’s set to port 9000 — so I assume that LiveOSC is commandeering that port.
UPDATE on reading OSC responses back in Sonic Pi:
If you send an OSC command that just expects a return value, you have to actually read the cue log to see what OSC path it sends back, then use ‘get’ on the Time State.
Example:
##| requests info from track zero, device zero
osc "/live/device", 0, 0
##| returns all device parameters with names and values (it seems the OSCAPI doc is a bit incorrect here)
devices = get "/osc*/live/device/allparam"
puts devices
That will output an array into your log so you can find the exact parameter to manipulate.
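Once you’ve found the parameter you want in that array, you can (if I’m reading the LiveOSC command list correctly) set it by sending the same path with extra arguments — track, device, parameter index and value. The indices and value here are just examples:

```ruby
##| set parameter 2 of device 0 on track 0 to 0.5
osc "/live/device", 0, 0, 2, 0.5
```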