Playing a song using trumpets

Hello, Sonic Pi community.

I’m an experienced programmer (Python), but I know very little about music.

I found out about Sonic Pi through Dylan Beattie’s Art of Code talk. Afterwards, I transcribed one of my favorite songs into Sonic Pi, and it sounded awesome. I then experimented with some of the different built-in synths.

Now I’m looking to go further. I would like to add a trumpet to Sonic Pi so that I can play songs with it. How do I go about doing that?



It’s pretty much plug-and-play. All you need is Sonic-Pi, a MIDI cable (software or hardware), and an instrument (software or hardware). I will show you the software route, but in principle it’s the same if you have the necessary hardware.

Firstly, do you have a standalone virtual trumpet? If not, you can use a free SoundFont player like Sforzando (link to download page) and find a free trumpet SoundFont from the VSCO Community Edition to start with.

Once you have a standalone instrument up and running, you’ll need to route your MIDI from Sonic-Pi to that instrument. To set this up, you can use a free virtual MIDI cable such as loopMIDI. Just create a virtual cable, locate it amongst the MIDI outputs of Sonic-Pi, and route it to the MIDI input of your standalone instrument.

From there, your instrument should play back the notes you send to it.

I personally work in a DAW, so I usually route my MIDI into VST plugin instruments that run within my DAW. Let me know if you need any help getting into the world of music production!


A virtual MIDI cable set up in loopMIDI:


The MIDI outputs located in Sonic-Pi:


The MIDI inputs page in my standalone instrument application (Pianoteq):


Here’s the code to play your first note (insert your own port name if it is different):

midi :c4, vel: 64, sustain: 1, port: "loopmidi_port_2", channel: 1


P.S. If you’d like better sounds (unless you want your music to sound like an old RPG), you might consider Kontakt Player, another free and popular software sampler instrument. If you get the Komplete Start music software bundle (also free), you’ll be armed with the necessary plugins to begin music production in a DAW.

I am passionate about music production and would gladly write a tutorial for those looking to incorporate Sonic-Pi in their music :heart:


Thank you for the assistance, d0lfyn. I was able to accomplish what I wanted.

I was using play_pattern_timed to play the song, so after connecting to Sforzando as you described, I created a custom method.

define :play_pattern_timed_midi do |notes, times, **args|
  if is_list_like?(times)
    t = times.ring
    notes.each_with_index do |note, idx|
      kwargs = args.last.is_a?(Hash) ? args.last : {}
      duration = t[idx]
      kwargs[:duration] = duration
      midi note, **kwargs
      # play note, **kwargs
      sleep duration
    end
  else
    play_pattern_timed_midi(notes, [times], **args)
  end
end
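For anyone who wants to check the note/duration pairing outside of Sonic Pi, here is a plain-Ruby sketch of the same scheduling logic. `midi` and `is_list_like?` are Sonic Pi built-ins, so they are stubbed here just to record what would be played (the sleep is left out):

```ruby
# Plain-Ruby sanity check of the pattern logic; Sonic Pi built-ins are stubbed.
EVENTS = []

def is_list_like?(x)    # stub for Sonic Pi's list check
  x.is_a?(Array)
end

def midi(note, **opts)  # stub: record what Sonic Pi would be asked to play
  EVENTS << [note, opts[:sustain]]
end

def play_pattern_timed_midi(notes, times)
  times = [times] unless is_list_like?(times)
  notes.each_with_index do |note, idx|
    duration = times[idx % times.length]  # cycle durations like Sonic Pi's .ring
    midi(note, sustain: duration)
    # in Sonic Pi: sleep duration
  end
end

play_pattern_timed_midi([:c4, :e4, :g4], [0.5, 0.25])
# EVENTS => [[:c4, 0.5], [:e4, 0.25], [:g4, 0.5]]
```

Note how the shorter duration list wraps around for the third note, which is what `.ring` does in Sonic Pi.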

I encountered an error though: undefined method `last’ for {}:Hash
I’m not sure why I got this error. Maybe it’s due to the version of Sonic Pi I’m using?
In any case, I replaced that line with

  kwargs = if args.values.last.is_a?(Hash) then args.values.last else {} end

After that, the song played, but it didn’t sound good. I realized that when the duration is passed into play, the sustain is affected. This, however, does not happen when duration is passed into midi.

So I replaced the midi line with

  midi note, sustain: duration

The song still didn’t sound quite right though. Next, I learned that for play, the sustain and release of a note are affected by the BPM. This might have been obvious to most, but as I said, I’m new to music. To account for that, I added the following

multiplier = 60.0 / current_bpm  # 60.0 avoids integer division at tempos above 60 BPM
midi note, sustain: duration * multiplier

I’m not sure if the release is being altered as well, but now the song sounds like it should.
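To double-check the arithmetic, here is the conversion in plain Ruby, with an assumed tempo of 120 BPM:

```ruby
# Beats-to-seconds conversion for the midi sustain opt (assumed 120 BPM).
bpm = 120.0
multiplier = 60.0 / bpm          # seconds per beat
duration_in_beats = 2.0          # e.g. a note held for two beats
sustain_in_seconds = duration_in_beats * multiplier
# two beats at 120 BPM last 1.0 second
```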


You’re welcome @rahelios! Congrats on your success! :smiley:

To address the undefined method error, a bit of Googling tells me that the double-splat parameter is in fact a hash, for key-value args, as the error message indicates. I believe that last should work as intended if you use the single-splat operator instead (i.e. *args), which gets you the intended args array.

The Sonic-Pi documentation explains what you have discovered, that the duration of a note in beats is specified by the sustain opt, and not duration. Your multiplier is in units of seconds/beat (since you divided 60 seconds/minute by your beats/minute), and when multiplied by the duration (in beats), it yields a sustain time in seconds. This conversion seems strange to me, given that the sustain opt is specified in beats, not seconds.

And regarding the release time, it should be noted that MIDI signals do not carry any data about the ADSR envelope, which is configured within the instrument itself. The MIDI itself consists of simple on and off signals. This diagram illustrates the relationship between the MIDI ON signal (green), the MIDI OFF signal (red), and the instrument’s ADSR envelope setting (blue).

If you need to compensate for the release time of an instrument, you can reduce the time of your MIDI signal. You could calculate this compensation while experimenting with the release time of your instrument so as to achieve a relatively natural or synthetic sound.
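As a rough sketch of that compensation (the numbers here are made up; you would measure or tune the release time against your own instrument):

```ruby
# Hypothetical release compensation: end the MIDI note early so the
# instrument's release tail finishes around the intended time.
release_time = 0.3   # assumed release time of the instrument, in seconds
note_length  = 1.0   # intended total length of the note, in seconds
sustain = [note_length - release_time, 0.0].max  # never go negative
# sustain comes out to 0.7 seconds
```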

Keep at it! :+1: