Is it possible to control via Sonic Pi an instrument in Ableton?

Is it possible to control an instrument in Ableton from Sonic Pi? I just discovered Sonic Pi and I love it! I would like to control an instrument/synth in Ableton by code (to enjoy the richness of the instruments and to exploit the packs I already bought!). I saw something from @robin.newman showing that it’s possible, but Robin uses Max for Live and that seems a little bit complicated (and my installation of Max for Live does not work!). Is there anything else I can do?
Thanks for any help or hint or link,


There are many techniques you can use to do this. The easiest one, in my opinion, is to send MIDI notes and MIDI control change (CC) messages. Here is a tiny bit of code:

live_loop :notes do
  tick
  midi (ring :c4, :eb4, :f4, :g4).look, channel: 1        # MIDI notes
  midi_cc 0, (line 50, 100, steps: 40).look, channel: 1   # sending values on CC 0
  sleep 0.25
end
In your Ableton session, select the right MIDI port and channel, and select “IN”:


Then you can use different methods to assign CC numbers to any parameter you would like to tweak on your Ableton synthesizer/sampler/anything. I really like the CC Param Control Bank plugin because it’s very easy to use and to modify on the fly while playing. It’s a Max for Live device, but you can normally use it without having Max installed or owning a Max license.

It looks like this:

Tell me if you need more detailed instructions.

Be sure to check the included tutorials and documentation, because there is a ton of useful stuff to learn from these resources. Also be careful to give each MIDI event a channel; otherwise Sonic Pi can become very laggy and even crash.


Thank you so much @Bubo, I will try it as soon as possible!

Could you explain the general principle of this technique to me? Is this correct:

1- Sonic Pi sends MIDI messages to the outside world on a specific channel (say 0) (should I also specify a port somewhere?)
2- The IAC driver “reroutes” this message (I suppose that means it lets another app read the data)
3- Ableton Live connects to the IAC driver and reads the data (I suppose I have to select channel 0 on my track, right?)
4- On my track, I add the CC Param Control Bank first and then my instrument. After that, I map each control change to any mappable parameter of my instrument.

Am I correct?
Thank you again,

Hi @Patrick,

I can’t say anything about the IAC driver or Ableton (maybe this is helpful), but concerning your first question:

As soon as you specify and fire off a MIDI command in Sonic Pi, it gets sent by default on all channels; you can restrict it to a specific channel (such as channel: 3). You can see this in the cue log window (if you don’t see it, go to “Preferences > Editor” and check “Show Cue Log”). Make sure you do see these messages, and also check “Preferences > I/O > Midi Ports”; you should see MIDI in/out ports there. If not, something is wrong and MIDI won’t work. This is, as far as I know, also where you should see the IAC entry (I think that’s a kind of virtual MIDI device on macOS — or are you on Windows?).
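For example, a minimal Sonic Pi sketch of restricting a message to one channel and one port (the port name "iac_driver_bus_1" is an assumption — substitute whatever name Sonic Pi lists in Preferences > I/O on your machine):

```ruby
# Send a single note only on channel 3, and only to one port.
# "iac_driver_bus_1" is a guessed/typical IAC port name; check your
# own MIDI Ports preferences pane for the real one.
midi :e3, channel: 3, port: "iac_driver_bus_1"
```

Without the channel: and port: opts, the same midi call would go out on every channel of every connected port.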

The cue messages will contain something like this (this is also the format used to address a specific device):

# first part of a midi cue message sent to a hardware device recognised by Sonic Pi
# if you send midi to specific devices you can also use wild
# cards if port/channel is not important
# the general format is roughly (the exact separators vary between Sonic Pi versions):
# /midi/<port_name>/<channel>/note_on  [note, velocity]
I think for a start it should be sufficient to configure Ableton to listen to ‘Midi Through’ and then just send some MIDI from Sonic Pi.

There is certainly much more to be said about this subject.

  • Check out @robin.newman’s blog; he has worked a lot with Sonic Pi and MIDI and documented lots of things in a very helpful way.
  • Also search this forum for e.g. “Midi” or “IAC”, as you will probably find more information there (e.g. I found the aforementioned blog article about the IAC driver via a search here).

Hope that helps a bit.



Thank you @Martin for your help on this topic! Concerning the IAC drivers: these are virtual MIDI buses, used on macOS to send MIDI data between applications. I think every OS must have something similar. I’ve done a few Google searches for you and here is what I found:

  • For Windows: loopMIDI. This link will take you to what seems to be the most trusted utility for creating virtual MIDI ports.
  • Linux (Ubuntu, etc.): correct me if I’m wrong, but I think virtual MIDI ports are not necessary on Linux systems, and it should be fairly easy to configure something? (I don’t know enough to answer for Linux systems…)

If your computer is running macOS, you just need to activate the IAC ports: 1) open the pre-installed Audio MIDI Setup utility, 2) in the Window menu, choose “Show MIDI Studio”, then enable the IAC Driver.

Yes, that’s definitely true. Once you’ve got sound on Linux running via Jack (that can be tricky), the rest is — I think you could say — a piece of cake :wink:

Ah, and thanks for the input! Usually I’m on Linux, but I also have a Mac around and need to know something about MIDI configuration…

Just one more thing: I found a quite old but very helpful posting by @robin.newman which got me started on MIDI: !topic/sonic-pi/z8xUqzqdos8

(Scroll the thread down a bit to find a very long and precise explanation; I could not find the direct link …)

Thanks again, my first try is a success!
Could you just tell me @Bubo where to find documentation about the CC Param Control Bank? I did a search but found nothing. At the moment I can hear the sounds generated by Sonic Pi, but I don’t know how to map the CCs.
Thanks again to you and @Martin

I don’t think there is any precise documentation, but the plugin is pretty simple once you have figured out what the different controls do.


OFF : select your CC number here. The port is already defined on the track.
N | P | VS : Note, Pickup, Value Scaling. You may have to play with the plugin a bit to figure out what these do.
MAP : click on this, then on the parameter you want to automate, and the link will be created. If you’re using VSTs, you have to know how to expose a parameter from the VST GUI in the track toolbar.


Click on Configure and then on the knob/value/parameter you would like to modify in the VST GUI. You will see that a new slider, here highlighted in green, is created in the Ableton track toolbar. You can now use MAP to link to this new slider parameter :slight_smile: .

0 - 100 : CC value scaling.

You can ignore everything else and you will be just fine. Tell me if you have more questions. You should now be able to take control of almost everything in Ableton and tweak values from Sonic Pi.

Thank you so much, it’s working (a little bit) now! I still haven’t managed to get the exact values I want, but there is definitely progress! Strangely enough, with a 0-100 scale on CC 0, when I send values from 0 to 100 from Sonic Pi to Live, the mapped parameter (I tried a few) goes from 0 to 79. No change if I change the 0-100 scale to, say, 0-50. I still don’t understand the mapping part.
Anyway, thank you again!

I’ve never seen anything like that happen; I’ve never had a problem pushing values to the minimum or the maximum. There must be an answer to your problem. Try different instruments, VSTs, and methods of sending values… It may take a few hours of experimentation before you feel comfortable using Sonic Pi this way.

I think it is expecting 0→127 (MIDI) input values from Sonic Pi, which are mapped to 0→100 in Ableton.
That is why your value of 100 gives 79.

You are sending a value which is 100/127 of full range, which gives 0.7874, i.e. 79 out of 100, which is what you are getting.
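A quick plain-Ruby sketch of that arithmetic (the helper name midi_to_percent is made up purely for illustration):

```ruby
# Scale a 0..127 MIDI value onto Ableton's 0..100 display range.
# midi_to_percent is a hypothetical helper, just to show the maths.
def midi_to_percent(value, max = 100)
  (value / 127.0 * max).round
end

puts midi_to_percent(127)  # full MIDI range reaches 100
puts midi_to_percent(100)  # a "100" from Sonic Pi only reaches 79
```

So to hit the top of the mapped range, send 127 from Sonic Pi rather than 100.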

I’ve not chipped in on this thread so far, as I think you are happy using MIDI.
I used OSC input from Sonic Pi, with the Max for Live bundled with my Ableton Suite together with the free Max Connection Kit.
It’s some time since I did this (nearly a year ago!), but I remember finding it a bit fiddly to set up. The interface was not very user friendly, although I did get it controlling parameters in Live.
There is a video on YouTube.

Currently I’m involved in a project using Sonic Pi as an intermediary to allow an Ableton Push 2 to control the excellent Omenie St Just in Roseland organ app using MIDI. Once I’ve got that finished and documented I’ll try to have a further look at controlling Live parameters again.

Thank you @robin.newman, very kind of you. You’re right about the mapping; I should have realized myself that Sonic Pi expects 0-127 values and not 0-100.

About OSC vs MIDI, what is the difference for you? Is one more convenient than the other?
And if I can ask one last question: what’s the purpose of using Max for Live? Is it to build your own synths/sounds? When you use Max for Live, can you send MIDI messages from Sonic Pi, or is OSC mandatory?
P.S. I appreciated your video on Bach’s 14 Canons, very interesting (I look forward to seeing your code when you publish it)
Best regards,

I will try to explain it very quickly, but I know that I’m omitting details concerning the purpose and design of each of these communication protocols.

MIDI and OSC messages are two very different things. MIDI has been used since the mid-1980s as a communication protocol for synthesizers and digital audio workstations. It has evolved quite a bit since then, but it’s still a straightforward system for sending and receiving very specific information: a key being pressed, a knob being turned, a note/velocity/channel/port message, etc. Check out the official MIDI Association website to understand it better: Summary of MIDI Messages.

OSC is not strictly focused on synthesizers and is more “general”. It is really a protocol for communication between very different things (lights, sound, anything). You can use arbitrary names for the information you are sending. The biggest difference is that you send this information over a network, to a specific IP address, through the UDP/TCP protocols. You can think of it as a “modernized and more powerful” version of what MIDI was, but in my opinion it’s not the same thing at all, and the two approaches are equally valid and powerful.

You now have to choose between OSC and MIDI. I personally think MIDI is far easier to use for communicating between Sonic Pi and a synthesizer or DAW. However, if you are designing a complex system, you may feel the need for the power of OSC messaging to deal with lots of complex information.
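To illustrate the difference, here is roughly how the two look from the Sonic Pi side (a sketch: the port 4560 and the address name /live/release are assumptions — they must match whatever the receiving end is listening for):

```ruby
# MIDI: a control change -- fixed vocabulary, value constrained to 0-127
midi_cc 0, 100, channel: 1

# OSC: an arbitrary address name and an arbitrary value range,
# sent over the network to whatever is listening on the given port
use_osc "localhost", 4560
osc "/live/release", 0.8
```

With MIDI the receiver already knows what CC 0 on channel 1 means; with OSC the receiver has to be told (or programmed) what “/live/release” should do.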

Concerning Max for Live:

In the example above, Robin is using a Max for Live plugin to parse the OSC messages received over his computer network. Its purpose is not to produce sounds; Ableton produces the sounds, using the OSC information passed on by the M4L plugin.

Max for Live (M4L) is a host for plugins designed using the visual programming language Max. It’s a powerful language specifically for audio-visual programming. You may well be able to create any synth you can think of with it, but it’s one more thing to learn, and Max can be very time consuming when you start using it.

Thank you so much @Bubo, your explanations are very clear; I understand more now. Can I ask you to clarify one thing, though? When you say:

In the example above, Robin is using a Max for Live plugin to parse the OSC messages received over his computer network. Its purpose is not to produce sounds; Ableton produces the sounds, using the OSC information passed on by the M4L plugin.

Why is it necessary to have this TouchOSC plugin? Why doesn’t Ableton let you choose, for a specific track, OSC From alongside MIDI From and Audio From?

And a last question: in @robin.newman’s example, where are the OSC messages received in Ableton from Sonic Pi sent? To the Multi Synth instrument?

Thank you again for helping a newbie like me!
Best regards,

It is not necessary to have the TouchOSC plugin. You can use the method I described at the beginning of this thread and you will be just fine. That way, you can indeed simply choose MIDI From [your midi port] and work with that.

In @robin.newman’s example, the information is received by the last plugin on the right and sent to the instrument on the left of the rack. You can see him initialize a few messages to control parameters of this rack synth: /live/attack, /live/release, etc. @robin.newman, correct me if I am wrong, but I believe he is mapping the values sent with the messages to parameters of the synth on the left, as I showed you the other day.

Sorry @Bubo, maybe my question was not clear (sorry, English is not my first language). I meant: suppose you want to use OSC; is it necessary to use a plugin like TouchOSC to receive the OSC messages from outside? If so, I was wondering why it isn’t possible in Ableton to simply choose OSC From on a track (like you have MIDI From). Wouldn’t that be simpler?

Don’t worry, English is not my first language either… French is.

Indeed, you always need something to parse OSC information, because it can be complex and always needs to be routed or guided to some parameter or other destination. For instance, when you send /live/release + [some_value], there is nothing in the OSC message itself that tells the receiving system what that name refers to. You always need to do a bit of work to direct the information you are sending to something.

On the other hand, MIDI messages are very strictly defined. If you send a note_on message, it’s a note_on and not something else.

Perfectly clear! OSC is so general that we need a kind of interpreter between the sender and the receiver. Does that mean I can use any OSC “name”, like /live/my_release, as long as I tell the interpreter what my_release means?
Thanks again @Bubo
P.S. too funny, french is my speaking language too! :slightly_smiling_face:

I think @Bubo has answered most of your questions. He is perfectly right that OSC may be more involved, because you need some mechanism on the receiving side to recognise the “address”, like /live/release in your example above, and then to extract the parameters that follow it in the OSC message and direct them to do what you want. In my video I used different messages to send parameters, which were then connected by the Max for Live plugin to the plugin values I wanted to control. I just used this part of Max for Live, because it was there, to perform this specific task.
For merely sending note information (pitch, duration, etc.) MIDI is perfectly acceptable. Where OSC can be better is in situations where you want a wider range of possibilities than the 0→127 range of MIDI parameters. (MIDI does allow greater resolution in the pitch bend control, for example, which allows finer steps, but this is the exception rather than the rule.) With OSC you can use anything you like in the messages, e.g. data in the range 0→1000 if you want. I also find OSC very easy to implement at the Sonic Pi end.
For controlling a DAW, MIDI may well be easier.
One other point: Sonic Pi uses OSC messages very heavily for internal communication, and in fact it converts incoming MIDI to OSC-format messages in order to use it!