How to set up hardware and software for a fun first live coding gig?

Hi! Hello from Australia! I really love Sonic Pi, but our live coding community is very small and dispersed, so I have really been ruminating on these questions; any help is greatly appreciated! :sunflower:

I’ve been diving deep into the forum to find some answers, but I’m having a little trouble, so I thought I’d post here :slight_smile: feel free to reply with links to other posts if I’ve missed them!

Context: I have my first live coding performance coming up in a little less than a week (which is very exciting). I am a vocalist and will be doing an improv set with Sonic Pi to accompany me, with my vocals running through :live_audio. I will be performing genres spanning ambient, experimental and RnB. I’ve been using Sonic Pi for a couple of years now, but my technical coding knowledge is not super deep! I did one foundational subject at uni, so I’d love it if you could please explain any code responses if possible :sweat:

Exact setup/what I hope to achieve:

* my laptop is a MacBook Pro M1, 16GB RAM
* I have a Focusrite Scarlett 8i6 interface which will connect my laptop to my microphone (Shure SM58) and MIDI keyboard (Arturia MiniLab MkII)
* my vocals will be processed by Sonic Pi with fx on the :live_audio (this works fine, BUT I would love any tips here on transitioning between songs without :live_audio turning off, and on avoiding feedback in a live setting)
* Sonic Pi will be receiving MIDI messages from my keyboard (this works well)
* connection to a projector in the venue. Q: how do people play visuals so seamlessly behind the program without seeing the taskbar/YouTube window? I know how to make SP transparent, but if I fullscreen it I haven't worked out how to fullscreen the YouTube video graphics behind it at the same time :frowning:
* the venue has a sound desk for mixing, plus speakers and foldback speakers for me to hear myself
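For context, my vocal chain is roughly this shape (the fx and their settings here are just placeholders for whatever I'll use on the night):

```
# keep the mic alive for the whole set - as I understand it,
# live_audio synths persist between buffer runs, so re-running
# with different fx wrapped around the same :vox name should
# hand the mic over to the new fx rather than restarting it
with_fx :reverb, room: 0.7, mix: 0.4 do
  live_audio :vox, stereo: true
end

# and when I want the mic gone completely:
# live_audio :vox, :stop
```

Is that the right way to think about it? On feedback, I'm guessing the usual non-code tips apply (keep the mic off-axis from the foldback, go easy on high mix: reverb/delay on the vocal), but I'd love to hear what works for people.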

  • **my main question/worry is: how do I send the audio from each loop to the sound desk to mix, OR to a hardware mixer to mix myself? When live coding at events/algoraves, how do people connect to the sound at the venue? Is there an acceptance of mixing as you code, via amp parameters for example?** (Although I am worried mixing myself may be difficult, as the audience experience will be different to my sound at the foldback speakers…)

Potential solutions I’ve been made aware of from in_thread or have been thinking about (but may not work too effectively):

  • using something like TouchOSC to program my own buttons/faders for amp and cutoff - awesome! But how would the sound engineer have access to this? Is there potentially an easier tool (time permitting) with simpler code, perhaps?

  • could I use a hardware mixer with knobs and aux channels (e.g. 10 channels) connected to my Scarlett interface, sending audio from Sonic Pi with sound_out? If so, amazingggg, how would I specify to Sonic Pi the paths/names and which channels to send sound to? (something like below)

  • I also use Ableton and have sent MIDI messages out to Ableton synths using the IAC driver. Is there a way to use sound_out from SPi to an Ableton audio channel, perhaps? Then I feel it could be really easy to map my MIDI controller to parameters and mix quickly/easily as I code and perform…
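To make the hardware mixer idea concrete, I imagine something like this - the channel numbers are pure guesses, and I'm assuming they'd be the 1-indexed outputs of whatever audio device Sonic Pi is set to use in its preferences:

```
# guessed channel numbers - 1-indexed outputs on whatever
# device is selected in Sonic Pi's audio preferences
live_loop :drums do
  with_fx :sound_out, output: 3 do    # mono send to hardware output 3
    sample :bd_haus
  end
  sleep 0.5
end

live_loop :pads do
  with_fx :sound_out_stereo, output: 5 do    # stereo send to outputs 5 and 6
    synth :hollow, note: :e3, release: 4
  end
  sleep 4
end
```

From what I've read, the :sound_out fx also passes audio on to the normal stereo mix, and setting amp: 0 on the fx should keep a loop out of the main outputs if I want fully separate sends - but I'd definitely test that before the gig. Is this on the right track?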

bonus question/enquiry: I would really love to hear how people set up their live performances to make them as easeful/seamless as possible, plus any other tips for mixing or the logistics of performing a live coding set


Hi - thanks so much for this thread! All the best for your gig, and let us know how you get on :dancer:

I’m trying to piece my kit together with Sonic Pi and learn the ropes too. Hope we can share notes and learning.

Up to now, I use a Boss looper (it has MIDI) with acoustic input, and video played out via HDMI. The video is MP4s on a Raspberry Pi, controlled by a footswitch. I’m interested in more generative options :sweat_smile: , and have been playing with Processing.

I love that Sonic Pi lets you see a relationship between music and numbers. Anything that helps to demystify tech is empowering! I’ve really enjoyed using it and have now used it with Circuit Tracks (a MIDI controller/sequencer). It has mixer knobs, but I haven’t got them working with SP yet; early days. If anyone has mapping tips, please share!

Projection - it sounds like your venue is well kitted out. They should be able to take your Mac’s HDMI feed to their screen. Ideally they’ll have a video mixer too.

Here’s a DIY lo-fi version, in case it’s useful for small halls:
If there’s no screen, a wall plus a school projector works. To avoid projecting onto yourself and casting shadows, use a short-throw projector from the front, or back-projection onto transparent material.

It sounds like Ableton is your answer for a mixer. IMO the more you can control at your end, the better. Even with only two sends (vocal mic and looper out), getting set up at a venue in a short prep time has been a challenge for me.
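I haven’t tried this myself, but as I understand it the IAC driver is MIDI-only, so to get audio into Ableton you’d need a virtual audio driver (BlackHole or Loopback on macOS are the ones I’ve seen mentioned). Roughly: point Sonic Pi’s audio output at the virtual device (or an aggregate device containing it), send each loop out with the :sound_out fx, and set each Ableton track’s audio input to the matching virtual channels. Something like this - channel numbers invented:

```
# assumes Sonic Pi's output device is a multi-channel virtual
# driver that Ableton is listening to on its track inputs
live_loop :bass do
  with_fx :sound_out_stereo, output: 1 do   # Ableton track input 1/2
    synth :fm, note: :e1, release: 1
  end
  sleep 1
end
```

Then your MIDI controller maps to Ableton’s mixer the normal way. Worth a test run well before the gig!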

I’m aiming to use a MIDI controller with knob or fader mixing, ideally mapped straight to SPi. I don’t know how possible this is.

It would be great to gather people’s experiences using MIDI controllers with SPi. Circuit Tracks has been fun to connect, and I’m sure it can do more with SPi. I’d be interested to hear about your experience with the Arturia MiniLab.
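From what I’ve read, mapping a knob straight to an SPi level should be possible with something like this (untested by me) - the MIDI cue path and cc number below are made up; the real ones appear in Sonic Pi’s cue log the moment you twist a knob:

```
set :pad_amp, 0.5   # starting level before the knob is touched

live_loop :knob_watch do
  use_real_time
  # cue path and cc number are assumptions - copy the real
  # ones from the cue log for your own controller
  num, val = sync "/midi:minilab_mkii:1/control_change"
  set :pad_amp, val / 127.0 if num == 1   # scale 0..127 to 0.0..1.0
end

live_loop :pads do
  synth :prophet, note: :e2, amp: get(:pad_amp), release: 2
  sleep 2
end
```

If anyone has this working with a Circuit Tracks or MiniLab, please correct me!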

It feels like a huge exciting jigsaw in several dimensions at the moment, trying to piece things together from scattered sources. Thank you for diving in! Look forward to hearing about your first live coding gig. :sunglasses:


Thank you for this response! It’s so great learning about other people’s setups! Yeah, I reckon Ableton might be the one for now, until I work out how to map knobs straight from my MIDI controller to SPi!

Thanks again! ahhhh hopefully it goes well! :smiley:
