Sonic Pi drives Visor

Been a bit quiet lately. First I was away on holiday, and then during the last few days I've been concentrating on getting to grips with Visor.Live, a great system that uses Ruby to give a nice interface to Processing, allowing live coding of visuals. It works well with Sonic Pi, and starting from a demo video by the author @jackvpurvis I produced a display controlled by MIDI signals from Sonic Pi. (I wanted to incorporate some 3D objects, so I had to take a detour to learn how to produce .obj files using Blender: Sonic Pi leads you into all sorts of stuff :slight_smile:)
Here is a YouTube video of the piece. Not quite the best resolution, but it gives a pretty good idea.


Hi @Robin,

thanks for that! Will you post some information about what you did? I’d be interested and would be grateful for some introductory help.

Hi Martin
I found Visor quite a steep learning curve to start with, but once you make some progress it gets easier.
Version 0.4.0 (the latest) is the one I am using. It is initially confusing. Try loading some of the samples (initially the ones not requiring MIDI support) to get the hang of the basic operation. I then found it very useful to work through the three demo videos, following along with my own installed version. I haven't tried the audio input on Linux, but have used both Soundflower and Rogue Amoeba's Loopback to connect on a Mac. Initially I used Spotify as a sound source, but once I had a working system I fed in audio from Sonic Pi. I used TouchOSC as a MIDI input device (using sliders and rotaries). The video used a LaunchControl XL as the source device, but you can emulate this using TouchOSC. You need to use a .json file to map the MIDI CC numbers to the names you want to use in Visor. 3D object files are also fun. (I had to learn some Blender to create one!)
You can use processing commands direct in many cases.
Here is a copy of a flashing star image you can try. It needs audio to work, and you must start the tap tempo (ctrl+enter).

Please ask about any other details you'd like.

1 Like

Hi @robin.newman,

I did some initial testing, but I found that Visor eats too much of my processing power (which I'd rather reserve for music). So either I'll need something lighter or - in the long run - I will need more power. Another solution is of course something running on its own hardware.

Visualisation is an issue for me because I have a gig coming up where it would be nice to have visually something more going on than just typed text :wink:

Amazing stuff, Robin. I’m still looking for a solution to do visualization with SP that would be accessible by my 8th graders, and Visor might be the ticket.

If I understand correctly, you’re using OSC to output MIDI? I’m a noob with digital music, so just trying to wrap my head around exactly how to do it…

Hi Bob

No, I'm only communicating with Visor via MIDI control signals (midi_cc commands). Basically you set up a .json text file (there are some examples supplied with Visor) which maps the midi_cc numbers you want to use to names, e.g. ka1, ka2, ka3 etc. These are then used rather like named variables which supply the corresponding value (sent in the range 0->127), but translated by Visor to appear as a float in the range 0.0 to 1.0.
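That CC-to-float translation is just a linear scaling. Here is a quick illustration in plain Ruby (the helper name `cc_to_float` is made up for this sketch; it is not part of Visor's API):

```ruby
# Sketch of the scaling described above: a MIDI CC value in 0..127
# becomes a float in 0.0..1.0. Assumed simple linear mapping.
def cc_to_float(cc)
  cc / 127.0
end

cc_to_float(0)    # -> 0.0
cc_to_float(127)  # -> 1.0
```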
I also use the sound output from Sonic Pi (which you can link via Soundflower or Loopback on a Mac), which is fed to the Visor audio input. Visor automatically applies an FFT (Fast Fourier Transform) to this and also determines a volume reading which you can use as a scaling factor.
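A volume reading of this kind is essentially an overall signal level. As a minimal sketch (plain Ruby, an RMS level; this is an illustration only, not Visor's actual implementation):

```ruby
# Hedged sketch: a root-mean-square level, the usual way a single
# "volume" number is derived from a buffer of audio samples.
def rms(samples)
  Math.sqrt(samples.sum { |s| s * s } / samples.length.to_f)
end

rms([0.5, -0.5, 0.5, -0.5])  # -> 0.5
```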
I also used TouchOSC to control Visor, as this can send MIDI signals (with the aid of an OSC-MIDI bridge) as well as OSC.
I have talked with the author of Visor and suggested that it would be good to add OSC support to Visor.

1 Like

Thanks for the reply. Visor is really intriguing, so I've added it to my ever-growing list of summer projects, which currently has about 10 audio-visual software platforms to investigate…

Here’s another piece I’ve just done. It’s quite hard to document as there are several pieces to the puzzle, but I’ll give it a go.

The Sonic Pi file is below:

# Sonic Pi sample groove with Visor visuals by Robin Newman, April 2019

use_midi_defaults port: "iac_driver_sonicpi", channel: 1
s = :loop_amen # assumed sample: the original post does not show which sample s refers to
set :kill, false
at 180 do
  set :kill, true
end

with_fx :gverb, room: 20, mix: 0.8 do
  live_loop :test do
    tick(:slow) if tick % 13 == 0
    puts look(:slow) % 13 / 13.0
    use_bpm (ring 90, 120, 40, 150).look(:slow)
    midi_cc 13, (look(:slow) % 13 / 13.0 * 127).to_i
    f = (ring 5, 7, 8).look(:slow)
    l = (ring 8, 11, 13).look(:slow)
    puts f, l
    sample s, beat_stretch: 0.5, start: 0.1, finish: 0.6, amp: 2, pan: (-1)**dice(2) if (spread f, l).look
    if !(spread f, l).look
      sample (ring :bd_haus, :bd_mehackit).tick(:dr), pan: (-1)**dice(2)
    end
    sleep 0.25
    stop if get(:kill)
  end
end

The Visor project file is below:

# ===== Default : Default
# set up alternative two-channel audio input with fft processing
# this code from Jack Purvis
if @input == nil
  @minim = AudioManager.instance.instance_variable_get(:@minim)
  AudioManager.instance.instance_variable_set(:@input, nil)

  device_name = 'Soundflower (2ch)' # input device selector
  device = Minim::AudioSystem.get_mixer_info.find_index { |mixer| == device_name }

  return if device.nil?

  mixer = Minim::AudioSystem.get_mixer(Minim::AudioSystem.get_mixer_info[device])
  @minim.set_input_mixer(mixer) # reconstructed: the original line was garbled here
  @input = @minim.get_line_in
  @fft_left =, @input.sample_rate)
  @fft_left.log_averages(22, 3)
  @fft_right =, @input.sample_rate)
  @fft_right.log_averages(22, 3)
end
colorMode HSB,100,100,100

def rose(del_theta, amplitude, k)
  0.step(TWO_PI, del_theta) do |theta|
    x = amplitude * cos(k * theta) * cos(theta)
    y = amplitude * cos(k * theta) * sin(theta)
    vertex x, y
  end
end
def polygon(x, y, radius1, radius2, npoints)
  angle = TWO_PI / npoints
  halfAngle = angle / 2.0
  0.step(TWO_PI, angle) do |a|
    sx = x + cos(a) * radius2
    sy = y + sin(a) * radius2
    vertex(sx, sy)
    sx = x + cos(a + halfAngle) * radius1
    sy = y + sin(a + halfAngle) * radius1
    vertex(sx, sy)
  end
end
def draw
  afactor = 100 # allows for lower output from soundflower compared to loopback
  #puts afactor*@input.mix.level
  puts ka1 # check midi input in console
  background 0
  # draw left and right rose petal images
  fill ka1 * 100, 100, 100
  push_matrix # save matrix state
  translate width / 2 - 400, 0.9 * height - (height * ka1) * 0.9
  rotate_at frameCount * 0.1, 0, 0
  begin_shape
  rose(0.01, afactor * @input.left.level * 100, 5) # call reconstructed: original parameters were lost
  end_shape CLOSE
  pop_matrix
  push_matrix
  fill 100, ka1 * 100, 100
  translate width / 2 + 400, 0.9 * height - (height * ka1) * 0.9
  rotate_at frameCount * 0.1, 0, 0
  begin_shape
  rose(0.01, afactor * @input.right.level * 100, 5) # call reconstructed
  end_shape CLOSE
  pop_matrix # restore matrix state
  # draw central polygon image
  translate width / 2, height / 2
  fill rand(100), 100, 100, 100
  begin_shape
  polygon(0, 0, 50, afactor * @input.mix.level * 100, 7) # call reconstructed
  end_shape CLOSE
end
# ===== test : Default
def draw
  background 0

  translate width * 0.5, height * 0.5
  rotate radians(frame_count)

  rect_mode CENTER

  rect 0, 0, 200, 50
end

The MIDI mapping .json file is next (it goes in the midi folder in Visor):

{
  "note_mappings": {},

  "control_mappings": {
    "ka1": 13
  }
}

I called it singleMidiCC.json as it just maps a single MIDI control number, 13, to the variable ka1.
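To illustrate how a mapping file like this resolves a CC number back to a name, here is a plain-Ruby sketch that just parses the same JSON structure shown above (this mimics the lookup, it is not Visor's code):

```ruby
require 'json'

# Parse a mapping like singleMidiCC.json and invert it so we can ask
# which Visor variable name a given CC number controls.
mapping = JSON.parse('{"note_mappings": {}, "control_mappings": {"ka1": 13}}')
cc_to_name = mapping["control_mappings"].invert
cc_to_name[13]  # -> "ka1"
```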

I also used Soundflower (2ch) as the loopback device to feed audio into Visor, which is documented in various places on the internet. I usually use the commercial Loopback utility for this on my Mac, but Soundflower is free. I had to scale the output using the variable afactor when using Soundflower, as I found that the audio signal was much smaller. (afactor = 1 when using Loopback.)

Visor.Live has built-in single-channel audio input with FFT processing. In this example that is bypassed, and the Processing minim library is used to set up a stereo input with separate FFT processing on each channel. I asked Jack Purvis, the creator of Visor, if this was possible and he sent this code by return of email!

I hope that this may give you some ideas to get going. The sketch uses two drawing functions. One draws a rose petal, the other draws polygons.
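For what it's worth, the rose function traces the polar curve r = amplitude * cos(k * theta). You can check a few of its points in plain Ruby without Processing (the helper name here is made up for the check):

```ruby
# Compute a point on the rose curve r = a*cos(k*theta), the same
# geometry the sketch's rose function feeds to vertex calls.
def rose_point(theta, a, k)
  r = a * Math.cos(k * theta)
  [r * Math.cos(theta), r * Math.sin(theta)]
end

rose_point(0, 100, 5)  # -> [100.0, 0.0], the tip of one petal
```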

1 Like

Thanks so much for sharing!

Exciting that Jack Purvis sent you that code, too!