Introduction - Jimmy Gunawan (Blender Sushi Guy)

Hi all, my name is Jimmy and I live in Sydney, Australia. I am currently an indie artist.

I am known online as Blender Sushi Guy, and I regularly make LIVE NODING videos using Blender and procedural add-ons. It’s not yet like Sam’s LIVE CODING, but one day it will all be realtime!

I stumbled into Sonic Pi by accident, maybe by fate. I was originally learning music programming using ChucK, from Ge Wang’s course. Sonic Pi, with its Ruby-based language, is a lot simpler and just as powerful. I don’t have a programming background, so I’m pretty much self-taught. Currently I’m reasonably comfortable with Python and Swift.

I used to play the organ for six years, but that’s about it. These days I am exploring music and sonic creation on the iPad; the Playgrounds app in particular is awesome.

Currently I am very interested in using Sonic Pi with the open-source Blender 3D to generate some kind of procedural animation and sound in realtime. After using Sonic Pi on and off, I think things have changed this year with Sonic Pi 3: it clicks more with my brain, and the OSC and MIDI support opens up a lot of possibilities.

So, my passion is the procedural generation of sound, visuals and interactivity.

I have got to thank Sam Aaron and the Sonic Pi artists out there for creating videos that help my exploration of code and music.

Sounds like there’s a couple of us who’ve gone the ChucK => SPi route. :smiley:
Which of Ge’s tutorials have you used? Went through Ajay Kapur’s Kadenze stuff and couldn’t help but get critical of the method involved. But Ge’s energy tends to be contagious, so maybe that one was better.
Would be curious about your Blender work. To me, it all feels more difficult than music/audio stuff.

Hi Alex,

Yes, I followed that free Coursera online course back then, and also purchased his ebook on music programming. I probably understood 30-35% of ChucK at the time; I might revisit it. But SPi gave me a boost of confidence in coding music.

I spent 1-2 years on and off with SPi, and also used Python-Sonic a little. I think I am kind of an intermediate coder-artist now (Python, a bit of Processing).

I am sure there are links between Code, Visuals and Music. And NODES!

With Blender 3D, I have been experimenting with procedural frameworks, NODES and such (Animation Nodes, Sverchok). There is a missing piece in Blender’s nodes: synths and samples.

You might have heard of a big piece of software called Houdini? Right now, Blender’s node situation is like a baby Houdini. Houdini can do MIDI, synths, etc., generating music using nodes. Blender… well, Blender probably can too, using Sonic Pi :slight_smile:

So that is the plan.

I was excited when I heard that Sonic Pi now has MIDI and OSC capability, since that can be used for “realtime” control.
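For example, a minimal OSC sender in Python might look like the sketch below. It assumes the python-osc package; the listening port differs between Sonic Pi versions (4559 in early 3.0 builds, 4560 in later releases), and the address and arguments are made up for illustration.

```python
# Minimal sketch: send an OSC message into Sonic Pi.
# Assumes the python-osc package; check Sonic Pi's IO preferences
# for the actual incoming OSC port (4559 or 4560 depending on version).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 4560)

# Hypothetical address and arguments: a note number and an amplitude.
client.send_message("/trigger/blender", [60, 0.8])
```

On the Sonic Pi side, a live_loop can pick this up with something like `sync "/osc/trigger/blender"` (or the `/osc*` wildcard form in newer versions) and play the received values.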

Blender does not work in realtime unless we are using the Blender Game Engine (BGE). However, Blender’s normal 3D environment seems to be getting faster and faster, which means realtime is eventually becoming possible.

I am still figuring out the best way to make Blender talk to Sonic Pi to generate music and synth sounds in the most fun and robust way.

Sonic Pi seems able to THREAD and handle live-loop input and output, while Blender normally executes code frame by frame.

Ideally, I want to see things like this:

  • Blender physics simulations triggering sounds in Sonic Pi (see the sketch after this list)
  • Sonic Pi generating and controlling 3D objects, for visualization and more.
  • MIDI >> Sonic Pi >> Blender?
  • etc
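The first bullet is already sketchable with today’s tools: a frame-change handler in Blender Python that watches a rigid-body object and fires an OSC message when it falls near the floor. Everything here (the object name, threshold, OSC path and port) is a made-up example, not a finished setup.

```python
# Sketch: Blender physics triggering Sonic Pi (run in Blender's text editor).
# Assumes python-osc is importable from Blender's Python, and that "Cube"
# is a rigid-body object in the scene; names, path and port are examples.
import bpy
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 4560)

def on_frame(scene, *args):
    obj = scene.objects.get("Cube")
    if obj is not None and obj.matrix_world.translation.z < 0.1:
        # Near the floor: treat it as an impact, and send the X position
        # along so Sonic Pi can map it to a pitch.
        client.send_message("/trigger/impact", [obj.matrix_world.translation.x])

bpy.app.handlers.frame_change_post.append(on_frame)
```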

If not Blender, maybe I will try Unity or build an Apple Swift app at some point. But let’s see if that is possible.

Endless possibilities!

For now, I will practice some Ruby and Sonic Pi, and see whether I can generate some kind of realtime 3D visualization.
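The opposite direction can be sketched too: Sonic Pi’s `use_osc` and `osc` commands send messages out, and a small Python OSC server can receive them. This sketch assumes python-osc; the port and address are arbitrary choices, and inside Blender you would queue the received values and apply them on the next frame rather than touching the scene from the server thread.

```python
# Sketch: receive OSC sent from Sonic Pi.
# In Sonic Pi:  use_osc "127.0.0.1", 5005  then  osc "/blender/scale", 2.0
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

def on_scale(address, *values):
    # In Blender, push these values onto a queue for a frame handler
    # to consume; printing is enough for a standalone test.
    print(address, "->", values)

dispatcher = Dispatcher()
dispatcher.map("/blender/scale", on_scale)
ThreadingOSCUDPServer(("127.0.0.1", 5005), dispatcher).serve_forever()
```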

@enzyme69 is there any chance you could explain a little bit more about what NODES are?

Hi Sam, sorry for the late reply.

Hmm, what are NODES?

From my own understanding, a node is sort of like a container or a toolbox: it usually has inputs and outputs, and it can perform a simple function, a simple process, or an algorithm. Some nodes generate things, some process data, some split values apart, etc. We can hook nodes together to make a more complex setup that solves problems.
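A toy Python version of that picture, just to make the inputs-and-outputs idea concrete (this is only an illustration, not any real node framework):

```python
# Toy illustration of nodes: small functions with inputs and outputs,
# hooked together into a chain that solves a bigger problem.
def generate(n):            # a "generator" node
    return list(range(n))

def scale(values, factor):  # a "process" node
    return [v * factor for v in values]

def to_notes(values):       # map numbers into a MIDI-ish note range
    return [60 + v % 12 for v in values]

# Wiring: each node's output feeds the next node's input.
print(to_notes(scale(generate(8), 3)))   # [60, 63, 66, 69, 60, 63, 66, 69]
```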

During my computer graphics lifetime, I have encountered a lot of software with these nodes:

  • Maya 3D
  • Houdini 3D
  • NodeBox
  • Blender Compositing
  • Blender Cycles Material creation
  • And … finally Blender with add-ons like Sverchok and Animation Nodes.

These “nodes” frameworks somewhat click with my brain. Scratch is actually a bit like a node framework, but more blocky.

Anyhow, imagine if Sonic Pi were made of nodes; then perhaps we would have:

  • Synth Node
  • Sample Node

Let’s say that within a Synth Node, we could go inside and start adding inputs like Frequency, etc. That is one way to think of it.
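As a toy sketch, a “Synth Node” could be a small Python object whose Frequency input is converted to a MIDI note and fired at Sonic Pi over OSC. All the names here are hypothetical, and Sonic Pi would need a matching live_loop syncing on the same path.

```python
# Toy "Synth Node": a Frequency input in, an OSC trigger out.
# Assumes python-osc and a Sonic Pi live_loop syncing on the same path.
import math
from pythonosc.udp_client import SimpleUDPClient

class SynthNode:
    def __init__(self, client, path="/synth"):
        self.client = client
        self.path = path
        self.frequency = 440.0            # the node's input socket

    def trigger(self):
        # Convert Hz to the MIDI note number that Sonic Pi's play expects.
        note = 69 + 12 * math.log2(self.frequency / 440.0)
        self.client.send_message(self.path, [note])

node = SynthNode(SimpleUDPClient("127.0.0.1", 4560))
node.frequency = 330.0                    # set the input...
node.trigger()                            # ...and fire the node
```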

With Blender and the Python-Sonic module, inside the framework I am using (Animation Nodes), we can sort of wrap some Python code that triggers Sonic Pi. However, the way I am currently doing things in Blender is updating code and sending messages once per frame update, so it is probably not as swift as how Ruby runs inside Sonic Pi. Blender is actually heading toward realtime, though, and the Blender Game Engine can do OSC, so dynamic, realtime triggering could be pretty cool.
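A minimal Python-Sonic call looks like the sketch below; wrapped inside an Animation Nodes script node, this kind of call simply gets re-executed on every frame update. It assumes the python-sonic package and a running Sonic Pi instance it can reach (newer Sonic Pi releases may need extra connection setup).

```python
# Minimal python-sonic sketch: trigger Sonic Pi's default synth.
# Assumes the python-sonic package and a reachable Sonic Pi instance.
from psonic import play, sleep

play(60)      # middle C on the current synth
sleep(0.5)    # wait half a second before the next trigger
play(67)
```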

I am not on that level yet. Still… inside Blender, one can draw lines or grease pencil sketches, resample the lines as points, and then pass the points’ X and Y into Sonic Pi as trigger values.
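A sketch of that stroke-to-sound idea is below. The grease pencil data path varies between Blender versions (2.7x has bpy.data.grease_pencil, 2.8+ has bpy.data.grease_pencils), and the OSC path is again an arbitrary choice.

```python
# Sketch: read grease pencil stroke points, send X/Y to Sonic Pi.
# Data paths follow the Blender 2.8+ API and vary across versions.
import bpy
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 4560)

gp = bpy.data.grease_pencils[0]                    # first GP data block
stroke = gp.layers.active.active_frame.strokes[0]  # first drawn stroke

for point in stroke.points:
    # Hypothetical mapping: X as a pitch-ish value, Y as amplitude.
    client.send_message("/trigger/stroke", [point.co.x, point.co.y])
```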

Nodes vs Codes:
Nodes are faster for prototyping and great for visually presenting logic, but there are things that code will do much more efficiently and faster. Node setups tend to get slower as they grow and become complex.

Back to Sonic Pi…
I am currently re-reading the Sonic Pi documentation and tutorials over and over, and actually learning some interesting things about programming.

I find Sonic Pi to be very dynamic and organic, especially with live loops. It’s almost like a realtime game where you are spawning lots of musical objects; that is how I visualize Sonic Pi in my brain.

Sonic Pi and the live_loop concept are mind-blowing for me. This dynamic interaction to generate sound is something I want to do.

Don’t know much about 3D animation or, indeed, any kind of visual work. So it’s interesting to hear what others are doing in that sphere. Do you have some examples?
Reminds me a bit of visual programming languages like Max and TouchDesigner. Not sure if those are in the same family as Houdini 3D and NodeBox, but could perceive some connections. Max was originally designed to work with MIDI, then audio. It’s now a full-fledged environment to work with visuals as well.
TouchDesigner does a few things with sound. Haven’t explored it that fully, but a friend is teaching it in college as it’s very easy to make prototypes (and it’s free for non-commercial use).

Same here. It’s no exaggeration to say it’s a game changer. Maybe something like it exists in other languages meant for live performance, but it was my first encounter and it had a strong effect on me. We quickly take it for granted, after working in Sonic Pi for a while. But it’s really another way to think of loops.

A new concept I grasped from Sonic Pi is multi-threading and the live loop.
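python-sonic can even push a whole live_loop into Sonic Pi as a string, which is a fun way to poke at that threading model from outside. The sketch assumes the python-sonic package and a reachable Sonic Pi instance; each live_loop then runs on its own thread inside Sonic Pi.

```python
# Sketch: define a live_loop in Sonic Pi from Python via python-sonic.
from psonic import run, stop

run("""
live_loop :from_python do
  play 60
  sleep 0.5
end
""")

# stop()   # uncomment to silence everything again
```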

I watched this fun video earlier today:

Ideally, I want something like Sonic Pi that can easily do IO and generate sounds to control, or sync with, animation and other controllers.

It is getting there, actually. Blender Python and modules like MIDO and Python-Sonic can help with some of these tasks.
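For the MIDI route, a mido sketch like the one below can drive Sonic Pi 3’s MIDI input (Sonic Pi picks notes up with something like `sync "/midi*/note_on"`, depending on version). It assumes mido plus the python-rtmidi backend; the port name depends on your system, so list the ports first.

```python
# Sketch: send a MIDI note for Sonic Pi 3 to pick up.
# Assumes mido and the python-rtmidi backend are installed.
import time
import mido

print(mido.get_output_names())   # find a real port name on your system
port = mido.open_output()        # or mido.open_output("name from list")

port.send(mido.Message('note_on', note=60, velocity=100))
time.sleep(0.5)
port.send(mido.Message('note_off', note=60))
```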

Ideally, I should be using something like Unity or the Blender Game Engine if I am aiming for realtime. But Blender’s normal environment might work when the music sync is baked, because Blender works with keyframes.
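Baking might look like the sketch below: take beat times (a hard-coded list here, standing in for timing data exported from Sonic Pi), convert them to frame numbers, and key the selected object’s scale so playback stays in sync.

```python
# Sketch: bake beat times into keyframes (run inside Blender).
# The beat list is a stand-in for real timing data from Sonic Pi.
import bpy

fps = bpy.context.scene.render.fps
beats = [0.0, 0.5, 1.0, 1.5, 2.0]    # seconds; hypothetical data
obj = bpy.context.object             # the currently selected object

for t in beats:
    frame = int(t * fps)
    obj.scale = (1.5, 1.5, 1.5)      # pulse up on the beat
    obj.keyframe_insert(data_path="scale", frame=frame)
    obj.scale = (1.0, 1.0, 1.0)      # settle back a few frames later
    obj.keyframe_insert(data_path="scale", frame=frame + 3)
```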

Yes, you are right. I have heard about Max and TouchDesigner, but they seem too high-end for me, since I have no budget for them.

On my YouTube channel, I occasionally play around with MIDI and OSC. It’s still pretty basic, but Sonic Pi really helps me in this area.

https://www.youtube.com/watch?v=6dBNPLdfTfM

Let’s see how it goes :slight_smile:

Imagine: in the near future, we could use Augmented Reality that listens to Sonic Pi, or control Sonic Pi using AR!