LiveOSC (Sonic Pi -> Ableton Live)

I’m still pretty new to Sonic Pi, but the first thing I needed was to record live_loops to tracks in Ableton. My goal was to sync BPM and start recording with tracks armed. I couldn’t find this elsewhere, so I thought I’d share what works so far on a Mac (not sure about Windows, but I believe it works the same):

UPDATE: Judging by a comment below, if you aren’t on a Mac you may need to install Python for this to work: https://www.python.org/

Install LiveOSC into the Live app on your system:

This receives on port 9000 and sends on 9001 (which is hard to find).

NOTE: You need to tell LiveOSC to send back to Sonic Pi’s port 4560. The LiveOSC command for that is:
osc "/remix/set_peer", "", 4560

A list of available LiveOSC commands is here:

The only thing missing so far is the global record, so I needed to map the record button to a midi_cc.

My test session had 4 live_loops. I also wanted to record midi to two tracks. I set up a define for this, like so:

define :osc_setup do
  use_osc "192.168.1.6", 9000
  osc "/remix/set_peer", "", 4560

  osc "/live/tempo", current_bpm
  osc "/live/arm", 0, 1
  osc "/live/arm", 1, 1
  osc "/live/arm", 2, 1
  osc "/live/arm", 3, 1
  osc "/live/arm", 4, 1
  osc "/live/arm", 5, 1
end

use_bpm 60

osc_setup
midi_cc 1, 1, channel: 9

To record multiple tracks at once I needed to set up a multichannel audio source using Loopback by Rogue Amoeba (a flexible aggregate audio builder). I used a normal “pass-through” as the source, with 10 channels, and routed everything but the default 1/2 stereo channels to 4 other output channel pairs like this.

Choose this as your Mac audio input source:

In Ableton you will also need to set this for your audio input:

And choose LiveOSC from your control surface under Link/Midi:

From inside each live_loop you can then use with_fx :sound_out_stereo with the channel you want to send to - remembering that with stereo they come in pairs - so I use 3, 5, 7, 9 for my 4 tracks.
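As a rough sketch, one of my loops looks something like this (the `output:` opt names the first channel of the stereo pair; the sample is just a placeholder):

```
live_loop :drums do
  # route this loop's audio out on channels 3/4 of the Loopback device
  with_fx :sound_out_stereo, output: 3 do
    sample :bd_haus
    sleep 0.5
  end
end
```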

You can also send MIDI out from the loops as needed. I use the IAC Driver on my Mac (it wasn’t enabled by default):
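A minimal sketch of sending notes out over the IAC bus - the port name here is a guess on my part, so check `midi_available_ports` in your log for the exact name on your system:

```
live_loop :keys do
  # port name is hypothetical - substitute the one from your own port list
  midi :e3, channel: 1, port: "iac_driver_bus_1", sustain: 0.5
  sleep 1
end
```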

With this, when you run your code, Ableton will start recording on the tracks it arms. You still need to stop recording in Ableton, but it will start recording from the beginning.

This is pretty rudimentary, but a good starting point.

Oh, and you can also control params on devices and all kinds of other things. This was one of my main reasons for trying to learn Sonic Pi - even though it is incredible on its own.

There is also a Python library (pylive https://github.com/ideoforms/pylive ) that allows python control of Live - but what seems to be missing is the runtime engine available in Sonic Pi.

But this allows for some serious generative audio, midi and parameter control via Sonic Pi.

ONE CAVEAT SO FAR:
I thought I’d be able to receive these commands using the Max 4 Live “Connection Kit”, which contains TouchOSC for learning/mapping OSC sends to parameters. This works much like @robin.newman shared in a previous post ( view it here ). However, for some reason I can’t receive any commands in TouchOSC at this point, even when set to port 9000 - so I assume that LiveOSC is commandeering that port.

UPDATE on reading OSC responses back in Sonic Pi:
If you send an OSC command that just expects a return, you have to actually read the cue log to see what OSC path it’s sending back, then use ‘get’ with the Time State.

Example:
##| requests info from track zero, device zero
osc "/live/device", 0, 0

##| returns all device parameters with names and values
##| (it seems the OSC API doc is a bit incorrect here)
devices = get "/osc*/live/device/allparam"
puts devices

That will output an array into your log so you can find the exact parameter to manipulate.
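Once you know the parameter index, you should be able to write a value back by sending the same path with extra arguments. This is a sketch based on my reading of the LiveOSC API doc - track 0, device 0, parameter 1 are placeholders, so use the indices from your own log output:

```
##| set parameter 1 on track 0, device 0 to 0.5
osc "/live/device", 0, 0, 1, 0.5
```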


First, thanks for sharing your explorations!

Which version of Ableton Live do we need?
Is the lighter version, Ableton Live Intro, sufficient?

Hi Isaac,

I don’t work with Ableton but I really appreciate this thorough documentation you have shared here!

The context of why I am generally interested in interfacing Sonic Pi with external gear/software: I meanwhile have a Fates, which is a Raspberry-based music platform originally developed by Monome and its community. Currently I am trying to marry Sonic Pi live coding with the different applications which Fates provides. So one of the main goals here is to clock Fates with Sonic Pi, which proves to be kind of tricky (although it seems to be simple).
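A minimal sketch of what sending a MIDI clock from Sonic Pi could look like (this assumes the receiving device is set to follow an external MIDI clock - an assumption on my part):

```
use_bpm 120

midi_start  # tell the external device to start its transport

live_loop :clock do
  # midi_clock_beat sends 24 MIDI clock ticks spread over one beat
  midi_clock_beat
  sleep 1
end
```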

Nice article. This was my previous post about using Ableton with SP

@nlb - I’m running the full version of Live 10, but I think it should work with the Lite version. LiveOSC also has two different forks on GitHub, one for 9.6 or later and one for earlier versions. TouchOSC is a Max 4 Live device, so you need the full version to access that (as Max isn’t included in the Lite version).

If it does work in the Lite versions I’m sure people would like to know. If you try it, please share your findings!

@Martin. Thanks! And that Fates looks like fun - now I want one! I need to dig deeper into that. It should be noted I suppose that yeah, the syncing is something I’m not even totally sure about yet with Ableton. As I’m just for now looking at recording and manipulating, the above works - trying to perform it all might be another story. As I’m still learning SP in the first place I haven’t gotten that deep yet. But algorithmic control of Ableton Live via SP code is definitely intriguing to me. Max is just not the way I envision code - at least for most of what I want to do. I’m not a super coder either but am familiar with many languages. Obviously using something more standardized is very powerful.


Well bad news…

Tested with Ableton Live 9 Lite:

It seems not to work.

Am I correct with the setup in Ableton?

The OSC messages are received fine (checked with another program).

EDIT: Must Python be installed? Start of a headache :slight_smile: under Windows 10

Interesting. Yeah, the Live setup looks accurate. I failed to mention the need for Python (forgetting that LiveOSC is Python-based). Python 2.7 is the default on Macs, and I had actually upgraded to 3.8 when installing PyLive, so I had that dependency already. I’ll edit my notes above though. Sorry if I steered you down one of those hurdle paths.

And yeah, worse, testing inside (other than trial and error) is tough without Max 4 Live and some of the tools in their Connection Kit (which, as mentioned in my caveat, don’t register with LiveOSC). TouchOSC is great for mapping any commands from SP to various things, as @robin.newman showed, but I was most interested in having easy command control for recording at this point. You can’t, for instance, map TouchOSC to the transport record button. For algorithmic ‘knob twiddling’ and everything else it would be great. Another ideal would be if you do get it working with Lite - but it is possible the developers realized it opened the door to getting around a full purchase and dropped OSC support from it. I’ll try to do some research into that.

One thing @nlb - Here are my MIDI/Link settings:

  • I don’t believe the Link features (not in your Lite version) matter - that’s for Midi.

It is interesting though that under “input” it shows the greyed out “touchAble”. I’ll have to look into this, but again might be a default only included in the full Live Suite.