Some background, and a request for input from those more experienced

I am a Rubyist, so when I realized Sonic Pi was written in Ruby, I knew I’d found something I could spend a lot of time in.

For years now, I’ve been interested in the “bedroom DJ” sort of setup - low-cost controllerism, effectively. Partly because I know this is a hobby I may fall in and out of, and I don’t want to spend hundreds or buy crappy equipment. Also partly because trying to convince my wife to let me pour hundreds of dollars into my hobby when I can’t even string a few beats together yet is a non-starter :rofl:

Recently, I discovered Unfa on YouTube (link). I’m the CTO of a municipal government by day, but my true love is all things Linux/open-source.

So - in short, I’m a career programmer, a newb to music production, and I’m cheap. Unfa’s guides to setting up Reaper, JACK, etc. are a perfect excuse to use ALL THE SKILLS, and I’ve been having a blast learning everything I can. I also discovered Output.com’s Arcade VST, and managed to get it working in Reaper via LinVST.

All that said, at this point I’m finding it hard to take some of the samples I’m creating in Arcade and work them into some sort of live performance setup (my true goal - I just want to be able to have an itch and jam out one night, with a possible recording if I ever feel confident enough). I’ve looked at several free native samplers with MIDI control, but they’re all yet-another-layer-to-learn.

I know what I want in my head. And working through the tutorials on Sonic Pi and watching @samaaron here (I keep this on loop lately), I can also generally see how I’d code it.

But I can also see a few areas where I’d want to tie in an OSC interface - for instance, I see Sam adjusting amp: params frequently (adjust, run, adjust, run, adjust, run, back to back). I’d personally like to create simple helper methods/singletons that let me set these sorts of parameters with dials - a hybrid live-coding/device performance, if you will.

For instance, I could set each buffer up with a slider in Open Stage Control (O-S-C) that is synced to shared state in each buffer. I could wire up virtual X/Y pads to FX params and, while live coding, swap FX settings into other parts. Or sample-management interfaces, so I can use a general keyword like “@melody” instead of a sample name, while using an interface to swap out which melody that keyword represents. These may all be awful ideas, but I know that until I try them I won’t get rid of my “itch” :stuck_out_tongue:
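Roughly, the “@melody” idea in my head looks something like this (totally untested, and the /melody OSC address is just made up for illustration - it would be whatever the controller actually sends):

set :melody, :loop_amen

live_loop :melody_picker do
  use_real_time
  msg = sync "/osc*/melody"     # hypothetical address sent by the controller
  set :melody, msg[0].to_sym    # e.g. the controller sends "loop_breakbeat"
end

live_loop :player do
  sample get[:melody]
  sleep 4
end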

My head is swimming in very interesting (to me) ideas like this.

I haven’t been able to devote the time lately to dive deeper into Sonic Pi’s Ruby capabilities, but it all makes me wonder: just how much Ruby can I use in a Sonic Pi run?

For instance, can I install gems, require them, and use them within Sonic Pi? Obviously that wouldn’t be very portable/sharable - but my goal, for now, is not to worry about that so much as to make music I want to listen to.

I don’t have a lot of questions here, but would love any thoughts, feedback, or insight on my setup/ideas above from those more experienced in Sonic Pi and music production in general!

Hi Keith,

I really hope that Sonic Pi can help you towards your goals. From what you describe, a lot of the functionality is already there - you can already easily add event handlers to react to incoming OSC messages and reliably (and deterministically) share messages across threads using the time-state system via get and set.

This means that manipulating any aspect of Sonic Pi via an external OSC/MIDI controller is already both possible and pretty easy.
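For example, something like this (with a made-up /fader1 path - use whatever address your controller actually sends) lets an external fader drive the amp of a running loop:

set :amp, 1

live_loop :fader_listener do
  use_real_time
  v = sync "/osc*/fader1"    # made-up OSC path for illustration
  set :amp, v[0]             # first OSC argument is the fader value
end

live_loop :beat do
  sample :bd_haus, amp: get[:amp]
  sleep 0.5
end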

With respect to supporting arbitrary Ruby - it’s important to point out that whilst Sonic Pi’s runtime language is currently built with Ruby, it’s not Ruby. It’s mostly a subset. Whilst other aspects of the Ruby language may work, and we don’t discourage you from using them, they are not supported. In fact, if the Ruby feature you’re using is not in the tutorial, it may behave in weird, unexpected ways or suddenly stop working without warning in a future release. You’re therefore very much on your own. As part of this, generic Ruby gems are definitely not supported in any way and are unlikely to be in the future.

This is for a few important reasons:

  1. With our current resources it’s impossible to provide support for the interaction between Sonic Pi’s internal functionality and all of Ruby. The surface area and interaction possibilities are just too large, and so much can go wrong. By reducing the surface area to the code within the tutorial, we can test and support users much more reliably.
  2. We may move away from Ruby as the implementation language in the future.
  3. We want to keep the language small and focussed to make it easy to document, learn and teach.

Having said all that, if you do find you really wish that a specific gem was available to you, please do let us know. We’re always interested in hearing about functionality that Sonic Pi doesn’t currently support so we can try our best to keep new features relevant and useful going forwards. It’s also possible that what you’re trying to achieve with that gem is already possible out-of-the-box.

Also, if you’re really desperate for a “full-blown” programming language then I can definitely recommend both Overtone, which gives access to all of Java’s ecosystem, and SuperCollider, which gives you a lot more control over the synthesis capabilities. Both are wonderful and more ‘powerful’ than Sonic Pi in a feature sense, but both are much harder to learn and more complex.

Personally, I find Sonic Pi’s balance between simplicity, focus-on-education and synthesis power to be perfect for my kinds of musical goals. However, everyone is different and you might find that’s not true for what you want :slight_smile:

This is absolutely my feeling so far (that almost everything I want to do will be supported out of the box). I don’t think I’d need much else - I was just wondering if it was designed to be more of a sandboxed Ruby runtime with limitations, or simply just Ruby under the hood.

Totally understand and support that. I also LOVE how focused on education this is, and will reach out to you soon about that. I think I’d like to spearhead some initiatives locally here with your blessing :wink:

I think it will be immediately popular in the schools I’ve worked with locally. But I digress :stuck_out_tongue:

Will do! I haven’t come across anything yet that I’d say I “gotta” have, but I’ll keep my eyes open.

My original curiosity was mostly around the question: “Can I build my own helper modules via class definitions and such?” - pure-Ruby sorts of things that shorten my code, increase readability, etc.

Just to be sure we’re chewing on the same apple: I fully support those views. I read a GitHub ticket recently where someone was trashing Sonic Pi (in a thinly veiled way), and I just found myself disagreeing so much.

I can see that vision while watching what you’re doing on-screen in a live performance - it’s easily understandable and followable, especially since you stick to everything seen in the tutorial. So count me as a fan of that vision - I just wanted to know how much trouble I could get into before diving down a thinly-documented-and-unsupported rabbit hole :rofl:


EDIT: PROCEED AT YOUR OWN RISK!

Using a live loop, I reopen the class and redefine the singleton method “ping”.

To get at the “sample” keywords of the Sonic Pi DSL, I pass in (self) as the context (there are lots of better ways to do this, but I just wanted to see).

Now, if you use Hello.ping(self), it will play the sample. If you change the amp: param inside the singleton, it changes everywhere you use that code.

Obviously I don’t think we should re-run a live_loop every 0.1 seconds just to reload class definitions, but as you can see, there are some simple Ruby tricks/patterns/anti-patterns one could use.

Fun!! This will probably spell disaster for me later, but I couldn’t help myself :stuck_out_tongue: Had to try!

EDIT: I promise not to submit support requests for any of this when it all just breaks for no reason, haha!

# https://gist.github.com/KeithHanson/a9f4c41150ae89696ee3502ef311867d

# Welcome to Sonic Pi
bpm = 90

live_loop :reloader do
  class Hello
    def self.ping(context)
      context.sample :elec_ping, amp: 0.3
    end
  end
  sleep 0.1
end

live_loop :metronome2 do
  sync :reloader
  
  use_bpm bpm
  Hello.ping(self)
  
  sleep 1
end

Keith, keith, keith… shakes head

Shame on you.

You do realise this is like leaving a loaded B.F.G
at the respawn point in Duke Nukem, with the words
‘SHINY NEW TOY’ floating above it.

Sam, Xav, Robin and a few others would take it and gently murderise
everyone else on the game-server… and then there’s people like me:

’If it moves… shoot it. If it doesn’t move,
shoot it in case it decides to move’.

I’ll be blowing holes in space time as fast as
the screen refreshes… it won’t be pretty.

Eli…

:rofl: :man_shrugging:

Well, I just had a really nice six hour session with Reaper/O-S-C/Sonic Pi, and I’m really really impressed.

Not only is it really good at what it does, I’m learning a ton from the tutorial about all these sound-design-ey things that are just harder to understand when every VST displays them differently. Going back to the VSTs now, I’m recognizing all the parameters and effects SP uses, and it’s clicking :+1:

I AM seeing some gaps I want to write helpers for. I just discovered sample’s slice: option, and having done a little OSC communication with Open Stage Control, I can see some opportunities to, say, create a helper method around loading a sample into a grid, chopping it up with a touch panel, and seeing the feedback - or another helper for abstractly wiring up all the listeners for OSC events.

Will post if I end up abstracting something useful for anyone else to use :slight_smile: (and will add warnings if I use dangerous features of Ruby! :smiley:)
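Just to make the slice idea concrete, here’s the sort of untested sketch floating around in my head (the /pad address is hypothetical - it would be whatever the touch grid sends):

set :slice_idx, 0

live_loop :pad_listener do
  use_real_time
  pad = sync "/osc*/pad"        # hypothetical touch-grid message carrying 0..15
  set :slice_idx, pad[0].to_i
end

live_loop :chopper do
  sample :loop_amen, slice: get[:slice_idx], num_slices: 16
  sleep 0.5
end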

Hi @keith

I did some work in that direction with Open Stage Control and TouchOSC; it is very old (and surely bad) code, as these were my first attempts in this direction. Also, you are a professional coder and my stuff is definitely at beginner level.

Nevertheless, I just want to let you know that I am interested in SP applications going in this direction. One reason is that I really do like coding and am very curious to learn more, and the best way to get better is just to try - irrespective of the level you start from. The other is that I planned to integrate this into my live coding practice.

I also continued with these ideas using a hardware controller, but currently I am a bit hesitant about how much energy is worthwhile for me to invest.

Hahaha, fun!

I’d love to know what you’re thinking about doing with a Singleton class with this technique. It feels like you’re just using the global aspect of it to enable you to store some behaviour globally (the content of the ping method).
Here’s my rough interpretation of your code into more idiomatic Sonic Pi. Is the behaviour of this in any way similar to the kind of thing you’re trying to achieve? :slight_smile:

set :bpm, 90

define :ping do
  sample :elec_ping, amp: 5
end

live_loop :metronome2 do
  use_bpm get[:bpm]
  ping
  sleep 1
end

Most definitely, in this case, @samaaron. This was just me seeing what limitations, if any, are in place for this type of stuff.

I can imagine writing several helpers/classes around slicing samples and wiring up OSC events. I don’t see anything I couldn’t do with define, though, so don’t take any of my Ruby hacks as needed changes or anything :slight_smile: Mostly just exploring at the moment!


Have fun! If you do stumble upon anything that seems at all useful/interesting do let me know!
