V5 Tech Preview 1 - Introducing Hydra Support

As a token of my heartfelt gratitude to everyone supporting me on Patreon, I thought it would be nice to give you a sneak preview of the fun stuff I’ve been working on recently. It’s still very early in its development, but it’s already exciting and expressive enough to share so you can all start playing with it.


(Just as a warning - this is a very early experimental prototype. So please expect bugs, poor documentation and things to be generally rough and unfinished. That’s the point - I wanted to get it into your hands so you have the opportunity to provide feedback and help improve the design and implementation of the final release which I’m hoping will be some time in the summer.)

Tau - a web-powered GUI component for fast development

I have been working towards a new web-powered interface for Sonic Pi for a long time now. This started with the integration of the Elixir programming language in v4, which brought with it a web framework called Phoenix and a powerful live interface technology called LiveView. The main idea is to use this new web technology to develop new interface ideas and components. Given LiveView’s very fast development workflow, it will be much quicker to develop and extend the interface than it is with the existing C++ GUI toolkit (Qt).

The other side of Tau is an integrated web view built directly into the app. This new view is built with the same tech as Chrome and is programmed to connect directly to and work with the LiveView backend described above.

It’s important to point out that none of this requires a connection to the web. Everything is built directly into Sonic Pi. Once you have downloaded and installed Sonic Pi you’ll have everything you need to live code an audio visual set - no internet required.

I have many, many things planned for Tau which I’ll continue working towards and sharing with you as they become a reality. However, one important long-term goal is to use it for visualisations. Back when I worked on Overtone (the predecessor of Sonic Pi) I was performing with visuals from a combination of OpenGL shaders (a simple clone of Shadertoy) and Processing (using a library I was working on called Quil). It was amazingly fun and has long been something I wanted to see inside Sonic Pi.

Today, v5 Tech Preview 1 introduces the first step towards that vision of powerful and expressive visual tools integrated deep within Sonic Pi.

As a “hello world” example to get things started I have integrated the wonderful Hydra visual synth into Tau. Hydra was created by Olivia Jack and has a vibrant and active community of visual artists sharing and discussing their work. You can start experimenting and playing with Hydra here:

http://hydra.ojack.xyz

The plan is for the vast majority of Hydra sketches to be simple to copy and paste into Sonic Pi and for them to “just work”. This is achievable with the new function hydra, which takes a single string representing the code for the Hydra sketch. For example, you can start with something very simple:

hydra "osc().out()"

Wait, what are osc() and out()? What do they mean, how does it work, and what else is available? All great questions, and for now there’s no real support in Sonic Pi to help with any of this. That will all come, including ways to trigger and manipulate complex sketches with the same simplicity as triggering a synth or sample.
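To give a flavour in the meantime, here’s the same one-liner expanded with a few extra chained transforms. This is only a sketch based on Hydra’s public API - osc() is a source (an oscillating pattern), the chained calls transform it, and out() renders the result:

```ruby
hydra "
  osc(60, 0.1, 1.5)  // oscillator source: frequency, sync, colour offset
    .rotate(0.2)     // rotate the whole pattern
    .kaleid(4)       // fold it into a 4-way kaleidoscope
    .out(o0)         // render to the default output buffer
"
```

Each transform returns the modified source, which is why the calls chain together until out() finally draws it.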

However, for now, I would definitely recommend you play around with what’s possible. Get a feel for it and see what you can do. A great starting point is the official Hydra API documentation.

Note that you can call hydra anywhere you would call synth or sample. This means you can put calls inside a live_loop and live swap the visuals in time with your music.
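For example, here’s a minimal live_loop that alternates between two sketches in time with a kick drum (a sketch of the idea, assuming the tech-preview hydra function is available in your build):

```ruby
live_loop :beat_and_visuals do
  sample :bd_haus
  hydra "osc(30, 0.1).kaleid(3).out()"   # swap in sketch A
  sleep 2
  sample :bd_haus
  hydra "voronoi(10, 1).out()"           # swap in sketch B
  sleep 2
end
```

Because hydra is just another call in the timeline, the visuals flip on the beat along with the audio.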

A Lovely Example to Try

Try running this sketch by Flor de Fuego, which is one of the many wonderful examples over at https://hydra.ojack.xyz

# "Glitch River" by Flor de Fuego
# https://flordefuego.github.io/
# Licensed CC BY-NC-SA 4.0: https://creativecommons.org/licenses/by-nc-sa/4.0/

hydra "voronoi(8,1)
  .mult(osc(10,0.1,()=>Math.sin(time)*3).saturate(3).kaleid(200))
  .modulate(o0,0.5)
  .add(o0,0.8)
  .scrollY(-0.01)
  .scale(0.99)
  .modulate(voronoi(8,1),0.008)
  .luma(0.3)
  .out();
  speed = 0.1"

Where is Tau?

Excellent question! Right now, you have to open the help system (hit the big help button) and click on the Tau tab. This is just a temporary home - I’ll figure out where it should go before releasing. Note that there are a few handy controls:

  - R refreshes the Tau window - useful if things misbehave and you want to reset.
  - E opens Tau in your default browser. You can have as many connections as you want, and when you call hydra the running Hydra sketch in all browsers will update simultaneously.
  - + and - change the zoom level.

What’s missing?

Lots and lots and lots and lots. I’m only just getting started and I expect to make a few tech previews as I explore various ideas. For example: if your Hydra syntax isn’t correct you don’t currently see the error; you can’t (yet) access the Sonic Pi audio data to drive the visuals; and you can’t (yet) easily stream new control values to a running sketch like you can with a running synth - for example, mapping MIDI controls or multiple live_loops to the running visualisation. This will all be possible.

What’s the mysterious spinning square in the top left?

That’s a proof-of-concept p5.js sketch running happily along. You can’t modify or live code it yet, but you will be able to soon. I just wanted to make sure both Hydra and Processing could co-exist in the same view.

Discuss and Share on Discord

Please do discuss and share your ideas on the Sonic Pi Discord server. I’d love to see what you come up with. There’s also a Hydra Discord where you can join and learn from those with considerable experience.

What else?

That’s it for now. I have so much planned - but I’ll leave discussing and sharing those plans with you for another day. For now, learn some Hydra, play around with the new functionality. Tell me what’s missing. What do you want to see next?

I hope that you’re as excited about Tau as I am!

Patreon supporters can download v5 Tech Preview 1 here: https://www.patreon.com/posts/v5-tech-preview-78420348


It would be nifty if the whole package would simply run in the browser (as Hydra does now):

  1. Only one version
  2. No downloads or installs
  3. Sonic Pi with Hydra presented as-is, like Klangmeister

It would indeed be nifty. However, it would require a complete rewrite of all of the core components. For now, the web part of Sonic Pi is an extra UI component. The sound is via the powerful SuperCollider server, the IO/web server is Erlang/Elixir, and the network metronome is Ableton Link. None of these have an equivalent 1-1 mapping in web tech.


OMG this is so funny. I had just completed a compile from git (so 5.x Tech Preview), and found myself leafing through the help system to brush up on a couple of things. I hadn’t read this message, so when I suddenly realised there was a “Tau” tab next to the “Docs” tab, I clicked on it. LOL. Scared the living sh*t out of me for a second there. XD.

I had seen some work being done on visualisers and ways to use SP for procedural image creation, but this caught me completely by surprise. I continued poking around, clicked here and there, and hello, that dynamic “test screen”-like thing was running in my browser.

All power to Tau, all power to the BEAM.

So now I’m messing with the code snippet above and reading up on Hydra. This is going to be good, really good.

Thanks a lot to @samaaron and everybody on the team and in the community.


This looks wonderful, v exciting.

This is sweet.
I think I built on Windows 11, using a Windows tiny image. Whilst my webcam works on Win 10, it doesn’t seem to work on Win 11.

I copied the build folder over to try on my Win 10 main OS, but Tau fails to load on boot. Does the build process build for the current OS, or is it possible it could and should work on 10?

Or did I need to copy more than the build folder?

Also, with ojack’s online Hydra the src can be a screen or a cam; does this already work in SP’s hydra?!

You can’t use the webcam (or OSC or MIDI) with the experimental built-in hydra in the tech5 preview version. It is an experimental setup and doesn’t have the necessary support code to allow these. It does, however, give you the ability to alter the hydra code (which is essentially parsed as a text string) by altering its contents using variables from Sonic Pi, and in this way to produce a display which can be synced with Sonic Pi and controlled and altered by it. It is not a released version but only available to Patreon supporters.
You can, however, use available libraries to interact with the web-based version of Hydra (on hydra.ojack.xyz), both using OSC and MIDI, and affect the display, again in a way which can be synced to Sonic Pi. This version also supports linkage to a web camera using hooks built into the Chrome browser, for example. NB this is NOT the same as the external browser window that Sonic Pi tech5 can produce, which merely mirrors the internal Sonic Pi hydra display and is connected via a localhost port.
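The point about the sketch being parsed as a text string is the key trick: you can assemble it with ordinary Ruby string interpolation before handing it to hydra. A minimal sketch of the idea (glitch_sketch is a hypothetical helper name, not part of Sonic Pi):

```ruby
# Hypothetical helper: build a Hydra sketch string from Sonic Pi values.
# The Hydra code is just text, so plain Ruby interpolation works.
def glitch_sketch(freq, sides)
  "osc(#{freq}, 0.1).kaleid(#{sides}).out()"
end

sketch = glitch_sketch(40, 6)
# => "osc(40, 0.1).kaleid(6).out()"
# In Sonic Pi you would now call: hydra sketch
# and re-call it with fresh values from a live_loop to keep the
# visuals in sync with the music.
```

Regenerating the string and re-calling hydra each cycle is the current workaround for streaming control values into a running sketch.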


Thanks for this. I’ve had a look at various OSC stuff - there’s the hydra standalone repo which has some examples. I’ve seen one which mentions using Node in Atom; I’ll try to set up VS Code with Node and try to connect to multiple devices.

I got distracted by that glitch.me site that naoto the hydra jedi uses

Not sure yet if I can simply use OSC messages from SP to hydra.ojack.xyz - from the examples it seems fully operational, with support for all the extra libraries (via the jigsaw-piece icon).

Interested in the network streaming stuff, so multiple displays (pb’s) are updated , or even from multiple sources

Looking on GitHub for the hydra repo that had “example 8 osc js”

I found this

Lots to play with, and learn!

That ojack/hydra-osc is what I use to communicate with the web version of Hydra. It works well and I have published various videos using it.
Of course MIDI is much easier to interface, but I think OSC gives greater flexibility.
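On the Sonic Pi side this only needs the built-in OSC support. A rough sketch - the port and OSC address here are assumptions, so match them to however your hydra-osc bridge is actually configured:

```ruby
live_loop :to_hydra do
  use_osc "localhost", 3333         # assumed port - check your hydra-osc config
  osc "/hydra/freq", rrand(10, 80)  # assumed address - map it in your bridge
  sleep 1
end
```

The bridge then forwards each message to the browser, where your Hydra sketch can read the value and use it as a parameter.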

see here