Is it possible to control an instrument in Ableton from Sonic Pi?

Perfectly clear! OSC is so general that we need a kind of interpreter between the sender and the receiver. Does that mean that I can use any OSC “name”, like /live/my_release, as long as I tell the interpreter what my_release means?
Thanks again @Bubo
P.S. Too funny, French is my native language too! :slightly_smiling_face:

I think @Bubo has answered most of your questions. He is perfectly right that OSC may be more involved, because you need some mechanism at the receiving side to recognise the “address”, like /live/release in your example above, and then to extract the parameters that follow it in the OSC message and direct them to do what you want. In my video I used different messages to send parameters, which were then connected by the Max for Live plugin to the plugin values I wanted to control. I just used this part of Max for Live because it was there to perform this specific task.
For merely sending note information (pitch, duration, etc.), MIDI is perfectly acceptable. Where OSC can be better is in situations where you want a wider range of possibilities than the 0->127 range of MIDI parameters. MIDI allows greater flexibility in pitch-bend control, for example, which permits finer steps, but this is the exception rather than the rule. With OSC you can use anything you like in the messages, e.g. data in the range 0->1000 if you want. I also find OSC very easy to implement at the Sonic Pi end.
For controlling a DAW, MIDI may well be easier.
One other point is that Sonic Pi uses OSC messages very heavily for internal communication; in fact, it converts incoming MIDI to OSC-format messages before using it!
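The receive-side mechanism described above can be sketched as a plain-Ruby dispatch table (this is not Sonic Pi's or Max for Live's actual internals; the addresses and the 0->1000 scaling are hypothetical examples):

```ruby
# Minimal sketch of a receive-side OSC "interpreter": a table mapping
# OSC addresses to handler blocks. Any address works, as long as the
# sender and this table agree on what it means.
class OscDispatcher
  def initialize
    @handlers = {}
  end

  # Register the meaning of an address
  def on(address, &handler)
    @handlers[address] = handler
  end

  # Route a decoded message (address plus arguments) to its handler;
  # returns nil for unknown addresses.
  def dispatch(address, *args)
    handler = @handlers[address]
    handler&.call(*args)
  end
end

live = OscDispatcher.new

# OSC arguments are not limited to MIDI's 0->127: accept 0->1000 here
live.on("/live/my_release") { |value| "release = #{value / 1000.0}" }

live.dispatch("/live/my_release", 750)  # => "release = 0.75"
```

The point is that the address itself carries no built-in meaning; the table is the "signification" mentioned above.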

Works with Reason as well! Made a post on this topic:

Ableton was my entry point into music production and it’s still my DAW of choice. I truly appreciate the information in this thread, and it will surely come in handy. Thanks!


Hello fine folks!

Has anyone noticed timing issues when sending MIDI messages to their DAW via IAC or other virtual cable?

I hear audible jitter when sending a steady stream of 16th notes at 100 bpm to Reaper, on a fresh reboot with no other apps open and Bluetooth and WiFi turned off. use_real_time / use_sched_ahead_time makes no discernible difference. This is for a single channel of monophonic 16th notes via IAC on an i7 2015 MBP.

It’s my understanding that some amount of jitter is unavoidable when using a virtual cable – or a physical one, for that matter – but what I’m getting is unusable. I plan to do some digging, comparing jitter from SP -> Reaper vs Reaper -> Reaper on both Mac and Windows, but thought I would ask in here as well.

Alternately, does anyone know of a SuperCollider synthdef that will output a continuous CV-like audio signal?

Are you addressing a specific MIDI channel? I’ve experienced strange behaviour when I neglected to specify a channel for each stream of notes:

midi :c2, channel: 1

It’s difficult to say precisely where the problem comes from in your situation. Can you tell us more about your configuration and about how you are routing from SP to your DAW? You should always get a steady stream if you’re not doing anything CPU-intensive.

After more testing, the jitter seems to be introduced primarily by the IAC bus itself. I sent some 16th notes sequenced in Reaper out via IAC and they came back with the same drifting pattern. How irritating!

Argh, that’s annoying! Happy that it’s not Sonic Pi though :wink:

Leaving this breadcrumb for posterity: I have identified this as a Reaper issue. https://forum.cockos.com/showthread.php?p=2118091#post2118091

I did some more digging, using ReceiveMIDI to log raw MIDI events with system timestamps, and a JS script I wrote to analyse the output.
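For reference, the analysis step is straightforward; a rough plain-Ruby equivalent of such a script (assuming one millisecond timestamp per logged event, not the actual JS script mentioned above) might look like:

```ruby
# Build a histogram of inter-event durations (in ms) from a list of
# event timestamps, e.g. as logged by ReceiveMIDI with system timestamps.
def duration_histogram(timestamps_ms)
  deltas = timestamps_ms.each_cons(2).map { |a, b| (b - a).round }
  deltas.tally.sort.to_h
end

# Ideal 16th notes at 75 bpm are 100 ms apart (0.125 beats)
timestamps = [0, 100, 199, 301, 400]
duration_histogram(timestamps)  # => {99=>2, 100=>1, 102=>1}
```

A tight sender produces a narrow single-peaked histogram around the ideal duration, like the macOS results below.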

Sonic Pi’s MIDI timing on Mac is excellent. I let this loop run ~500 times

use_midi_defaults port: 'usb2.0-midi_port_2', channel: 1
use_bpm 75

loop do
  midi_note_on 60
  sleep 0.125
  midi_note_on 60
  sleep 0.125
end

with MIDI signal path Sonic Pi MBP/OSX -> cheap MIDI<>DIN cable -> BCF2000 DIN input -> USB cable -> Win10 PC running receivemidi,
here is the histogram of durations in ms between the resulting MIDI events:

97: 3,
98: 37,
99: 196,
100: 510,
101: 203,
102: 36,
103: 1

That’s pretty solid, and more importantly, inaudible.

In Windows, however, it’s pretty ugly. Same code, Sonic Pi on Windows 10 PC -> LoopMIDI virtual cable -> receivemidi:

41: 1,
74: 1,
76: 1,
79: 1,
89: 3,
90: 16,
91: 40,
92: 76,
93: 89,
94: 130,
95: 91,
96: 49,
97: 15,
98: 10,
99: 16,
100: 21,
101: 29,
102: 14,
103: 3,
104: 14,
105: 41,
106: 55,
107: 70,
108: 37,
109: 37,
110: 48,
111: 43,
112: 24,
113: 6,
114: 9,
115: 2,
116: 5,
117: 2

This jitter is very much audible, and like sandpaper on my brain.

I did a test to make sure the virtual MIDI cable wasn’t the source of the jitter, with Reaper’s Win10 sequencer -> LoopMIDI virtual port -> receivemidi:

98: 1,
99: 297,
100: 589,
101: 7,
102: 2,
103: 35,
104: 46 

Not quite as tight as OSX via physical MIDI cable, but nowhere near as bad as SP on the same Windows machine.

I’ll be looking into what might be causing this discrepancy across OSes, but @samaaron, do you have any idea what it might be?

Super useful data, thanks!

Great to hear that timing on macOS is solid. The MIDI timing is controlled by an Erlang scheduler written by the dear and late Joe Armstrong. There’s quite likely to be some discrepancy in how precise the timing is, depending on the OS-specific APIs used by the Erlang code. I’m positive that this can be fixed. Joe also mentioned that a busy-waiting approach would eat a tiny bit of CPU but increase the accuracy even further on macOS.
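To illustrate the busy-waiting idea Joe mentioned, here is a sketch of the technique in plain Ruby (not Sonic Pi's actual Erlang scheduler, and the 5 ms spin margin is an arbitrary guess): sleep through most of the interval, then spin on a monotonic clock for the last stretch, trading a little CPU for precision.

```ruby
# Hybrid wait: coarse sleep, then busy-wait on a monotonic clock.
# SPIN_MARGIN is how early to wake up and start spinning; it should
# exceed the platform's worst-case sleep overshoot.
SPIN_MARGIN = 0.005  # seconds (arbitrary illustrative value)

def precise_wait(duration)
  deadline = Process.clock_gettime(Process::CLOCK_MONOTONIC) + duration
  coarse = duration - SPIN_MARGIN
  sleep(coarse) if coarse > 0
  # Spin until the deadline; never returns early
  nil while Process.clock_gettime(Process::CLOCK_MONOTONIC) < deadline
end

start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
precise_wait(0.02)
elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - start
# elapsed is at least 0.02 s, typically overshooting by far less than
# a plain sleep(0.02) would on a coarse-grained timer
```

The spin loop burns CPU for up to SPIN_MARGIN per wait, which is why it is usually combined with a coarse sleep rather than used alone.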

To test my theory, would it be possible to do a similar timing experiment using OSC over a loopback rather than MIDI? If you see similar results (macOS tight and Windows jittery), then that would back up my hypothesis.
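Such a loopback experiment can be sketched with plain UDP on localhost, timestamping each arrival (the port, spacing, and message format here are arbitrary choices, and this measures raw loopback plus Ruby sleep jitter rather than Sonic Pi's own scheduling):

```ruby
require "socket"

# Send a burst of UDP packets to ourselves over the loopback interface
# and timestamp each arrival; the inter-arrival deltas expose timer
# jitter on the platform.
PORT = 49162  # arbitrary test port, not Sonic Pi's OSC port

receiver = UDPSocket.new
receiver.bind("127.0.0.1", PORT)
sender = UDPSocket.new

arrivals = []
10.times do |i|
  sender.send("/test/tick #{i}", 0, "127.0.0.1", PORT)
  receiver.recvfrom(64)
  arrivals << Process.clock_gettime(Process::CLOCK_MONOTONIC)
  sleep 0.01  # nominal 10 ms spacing between packets
end

deltas_ms = arrivals.each_cons(2).map { |a, b| ((b - a) * 1000).round }
# On a tight scheduler the deltas cluster around 10 ms

receiver.close
sender.close
```

On a platform with a coarse timer (like the 16 ms Windows granularity discussed below), the deltas snap to multiples of the tick instead of clustering around the nominal spacing.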

@samaaron Confirmed, when logging OSC messages I’m seeing more or less the same distribution on Windows, and only +/- 1 ms variance on Mac.


Wonderful, thanks for letting me know :slight_smile:

@samaaron, I’m trying to debug this. I installed Erlang on my Windows machine, added some logging to pi_server.erl and successfully recompiled pi_server.beam. However, I get the following error when running SP, even after undoing my changes to the .erl file and recompiling:

init terminating in do_boot ({undef,[{pi_server,start,[[_]],[]},{init,start_em,1,[{_},{_}]},{init,do_boot,3,[{_},{_}]}]})

Which makes me think that perhaps the Erlang version I installed globally (21) and used to recompile is not the same as whatever version Sonic Pi is using to run these files.

Where does the SP Erlang runtime live?

The Erlang location is set in the util.rb file, in sonic-pi/app/server/ruby/lib/sonicpi.

On my build on a Mac, I set the location in the routine

    def erlang_boot_path
      case os
      when :windows
        erlang_bin_path = "\"#{File.join(native_path, "erlang", "bin", "erl.exe")}\""
      when :osx
        erlang_bin_path = File.join(native_path, "erlang", "erl")
        "\"#{ruby_path}\" \"#{erlang_bin_path}\""
        # Uncomment this if you want to use the system Erlang:
        #"erl"
      when :raspberry, :linux
        "erl"
      end
    end

to suit (I set it up to use my system erl).
On Windows, the info above may be useful to you.

So, it would seem that the jitter I’ve been experiencing on Windows is a result of the limitations of Erlang + Windows.

The resolution on Windows is even worse than that. The source is GetTickCount()/GetTickCount64(), which have a resolution of roughly 16 ms if I remember correctly.

My own tests corroborate this:

0> lists:foreach(fun(X) ->
  First = erlang:system_time(nanosecond)/1000000000,
  timer:sleep(X),
  Second = erlang:system_time(nanosecond)/1000000000,
  TimeDiff = Second - First,
  io:format("~p called\n~p slept\n\n", [X, trunc(TimeDiff * 1000)])
end, lists:seq(1,50)).

1 called
16 slept

2 called
16 slept

3 called
14 slept

4 called
16 slept

5 called
16 slept

6 called
15 slept

7 called
16 slept

8 called
15 slept

9 called
15 slept

10 called
16 slept

11 called
15 slept

12 called
16 slept

13 called
16 slept

14 called
14 slept

15 called
16 slept

16 called
30 slept

17 called
31 slept

30 called
32 slept

31 called
31 slept

32 called
46 slept

33 called
47 slept

34 called
45 slept

35 called
47 slept

36 called
46 slept

37 called
46 slept

38 called
46 slept

39 called
47 slept

40 called
46 slept

41 called
47 slept

42 called
45 slept

43 called
47 slept

44 called
46 slept

45 called
47 slept

46 called
47 slept

47 called
46 slept

48 called
61 slept

49 called
63 slept

50 called
61 slept

Bummer.

Intriguing. This might shed some light too: http://erlang.org/pipermail/erlang-questions/2011-May/058940.html

Perhaps ErtsMonotonicTime might be something we could use: https://github.com/erlang/otp/blob/master/erts/emulator/sys/win32/sys_time.c

Essentially, if the granularity of the timing on Windows is 16ms then I consider that a serious bug and will endeavour to fix it (although a fix may or may not make it to the next release).

Hey, I know I’m kind of late to this, but this seemed like the right place to ask. Sending MIDI signals into a synth in Ableton works for me, but it plays the notes completely out of time and in random bursts without any apparent rhythm. Is this just because my computer is struggling to keep up with the MIDI signals, or am I doing something wrong?

Hi @emil, which version of Sonic Pi are you using and on which OS?