[RFC] Block for live coding at runtime

Hey there,

Live-coding is gaining traction at the intersection of art and technology. It’s useful in performance (where I use it) and for quick prototyping (e.g. the interest in the node-based UI thread (1)).

To this end, I’ve created oschader, a project that controls Cinder at runtime via OSC. It defines Programs that operate on FBOs and take a number of inputs (textures, floats, etc.). The key feature is the ability, much like in a node-based UI, to define a base layer and then apply a number of effects to it. Here’s a video that might help you visualize what’s going on.
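
To make that concrete, here’s roughly the shape I have in mind — a minimal sketch only, with illustrative class names rather than oschader’s actual API:

```cpp
// Minimal sketch of the base-layer + effects idea. Every Program renders a
// fullscreen quad through its shader into its own FBO; an "effect" is just a
// Program whose input texture is the previous stage's output.
#include "cinder/gl/gl.h"

#include <map>
#include <string>
#include <vector>

using namespace ci;

class Program {
  public:
    Program( const gl::GlslProgRef &glsl, int width, int height )
        : mGlsl( glsl ), mFbo( gl::Fbo::create( width, height ) ) {}

    // Inputs are bound by uniform name: textures, floats, etc.
    void setInput( const std::string &name, const gl::TextureRef &tex ) { mTextures[name] = tex; }
    void setInput( const std::string &name, float value )               { mFloats[name] = value; }

    gl::TextureRef render()
    {
        gl::ScopedFramebuffer fboScope( mFbo );
        gl::ScopedViewport viewportScope( ivec2( 0 ), mFbo->getSize() );
        gl::ScopedGlslProg glslScope( mGlsl );
        gl::setMatricesWindow( mFbo->getSize() );

        int unit = 0;
        for( auto &tex : mTextures ) {
            tex.second->bind( unit );
            mGlsl->uniform( tex.first, unit++ );
        }
        for( auto &f : mFloats )
            mGlsl->uniform( f.first, f.second );

        gl::drawSolidRect( Rectf( vec2( 0 ), vec2( mFbo->getSize() ) ) );
        return mFbo->getColorTexture();
    }

  private:
    gl::GlslProgRef                       mGlsl;
    gl::FboRef                            mFbo;
    std::map<std::string, gl::TextureRef> mTextures;
    std::map<std::string, float>          mFloats;
};

// Render the base layer, then feed each effect the previous stage's result.
gl::TextureRef renderChain( Program &base, const std::vector<Program *> &effects )
{
    gl::TextureRef tex = base.render();
    for( Program *fx : effects ) {
        fx->setInput( "uTex0", tex );
        tex = fx->render();
    }
    return tex;
}
```

The interesting part is that an effect is just another Program whose input texture happens to be the previous stage’s output, which is what makes the node-like chaining cheap to reconfigure at runtime.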

Here’s where the RFC comes in: what would you want from a Cinder block that lets you experiment with shaders at runtime? This will be my first Cinder block, so I’m not sure how to go about choosing what to leave in for completeness and what to leave out for simplicity. A few specific questions spring to mind.

  1. Right now it uses OSC messages as a control mechanism (sent by oscillare (2), a Haskell-based control system). Should it include the OSC messaging system in the block itself, or leave the user to implement their own controller? (There’s a rough sketch of one possible split after this list.)

  2. Since I’ve mainly used this to complement live audio performance, it has microphone, camera, and image-file inputs. Which of these should be included in the block, and which left for the user to add? Ideally, whichever way it’s done, there would be an input state that the user could extend and modify.

  3. Will this be useful as a Cinder block? Should it be two blocks instead? Or is it so specific that it’s not really generally useful?
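
For (1), one option is to keep the block itself transport-agnostic and ship the OSC wiring as a sample: the block would only understand generic commands, and OSC (or anything else) would be one controller implementation among many. A rough, illustrative sketch — none of these names are oschader’s real API:

```cpp
// Sketch of a transport-agnostic control layer. The block only understands
// generic "set input X on program Y" commands; OSC, WebSockets, or key
// bindings are just Controller implementations that produce Commands.
#include <functional>
#include <map>
#include <string>
#include <vector>

struct Command {
    std::string        program; // e.g. "base", "blur"
    std::string        input;   // e.g. "uTime", "uTex0"
    std::vector<float> args;    // numeric payload
};

class Controller {
  public:
    virtual ~Controller() = default;
    // Called once per frame; implementations translate whatever they
    // receive (OSC packets, WebSocket messages, ...) into Commands.
    virtual std::vector<Command> poll() = 0;
};

class ProgramGraph {
  public:
    using Handler = std::function<void( const Command & )>;

    void setHandler( const std::string &program, Handler handler ) { mHandlers[program] = handler; }

    void update( Controller &controller )
    {
        for( const Command &cmd : controller.poll() ) {
            auto it = mHandlers.find( cmd.program );
            if( it != mHandlers.end() )
                it->second( cmd );
        }
    }

  private:
    std::map<std::string, Handler> mHandlers;
};
```

An OSC sample shipped with the block would then just parse addresses into Commands using Cinder’s OSC block, and a WebSocket or MIDI controller could be another implementation without touching the block itself.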

Thanks for taking the time to read this, I look forward to hearing suggestions, criticisms, or anything else.

3 Likes

It didn’t seem to let me put more than 2 links per post, so here are the other two links.

  1. Node-based UI thread in the old forums: https://forum.libcinder.org/topic/node-based-ui
  2. Oscillare, a controller for live-coding visuals with oschader: https://github.com/ulyssesp/oscillare

Having played with it, I would like a block, and I can help you if you need…

  1. The OSC control mechanism could be included as a sample in the block?

  2. In my videodrömm block I tried to have all sorts of textures as input textures for my shaders (different classes for image, audio, video, shared, and capture textures); it might be useful here. There’s a rough sketch of what a shared input interface could look like after this list.

  3. It would be useful for me.
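
For 2, something like a small shared input interface might keep it extensible. A rough, untested sketch (the class names are made up here, not the actual videodrömm classes):

```cpp
// One possible shape: every input is just "something that can hand the shader
// a texture each frame", so users can add their own sources.
#include "cinder/Capture.h"
#include "cinder/ImageIo.h"
#include "cinder/app/App.h"
#include "cinder/gl/gl.h"

using namespace ci;

class InputSource {
  public:
    virtual ~InputSource() = default;
    virtual void           update() {}             // called once per frame
    virtual gl::TextureRef getTexture() const = 0; // bound as a sampler uniform
};

// Static image file input.
class ImageInput : public InputSource {
  public:
    explicit ImageInput( const fs::path &assetPath )
        : mTexture( gl::Texture2d::create( loadImage( app::loadAsset( assetPath ) ) ) ) {}
    gl::TextureRef getTexture() const override { return mTexture; }
  private:
    gl::TextureRef mTexture;
};

// Webcam input via ci::Capture.
class CameraInput : public InputSource {
  public:
    CameraInput( int width = 640, int height = 480 )
        : mCapture( Capture::create( width, height ) ) { mCapture->start(); }

    void update() override
    {
        if( mCapture->checkNewFrame() )
            mTexture = gl::Texture2d::create( *mCapture->getSurface() );
    }
    gl::TextureRef getTexture() const override { return mTexture; }

  private:
    CaptureRef     mCapture;
    gl::TextureRef mTexture;
};
```

An audio input (for example an FFT texture built from audio::MonitorSpectralNode) would just be another implementation of the same interface.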

About the control mechanism: supporting WebSockets would be nice, as we could use an HTML-based UI like the_force or threenodes for live-coding and tweaking the uniforms.

Cool idea. While not exactly the same, I was thinking about Cinder when I came across the Runtime Compiled C++ project. Andrew Bell said that build times are basically the single biggest issue with using C++ as the underlying language for a creative-coding framework. So I wonder whether this runtime-compiled approach could be made to work with Cinder?

Thanks for this cool idea too @ulyssesp, I really enjoy using nodal systems in apps like Nuke for instance. :slight_smile:

@hurpyderpy: There is someone in the Cinder community making good progress on this. :slight_smile:

1 Like

Awesome! I’ve played around with some of the samples contained in that RCC++ repo and they build and run just fine on my current Linux Mint box.

Also, could someone be so kind as to migrate that thread into the new Discourse system for us? That would be appreciated. :slight_smile:

One other thing I forgot to mention: the original article that introduced me to the topic of Runtime-Compiled C++ use in gamedev is freely available online, as are many other articles from the original Game AI Pro book.

Oh awesome! Wasn’t aware of the runtime C++ work that Simon was doing. I’ll have to take a look and see what goodness can be gained from that :stuck_out_tongue:

The main benefit of this potential block is that it’s an abstract system that can be built out as far as the programmer needs, but changed quickly and from a single entry point. In fact, it would play quite nicely with Simon’s runtime C++ work, because you could target that single entry point straight from C++. Would it be more helpful, then, to gear this toward working smoothly with that library? Is that something people are interested in?
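
To make the “single entry point” idea concrete, this is roughly what I mean — purely illustrative, not oschader’s or RCC++’s actual API:

```cpp
// The host app only ever calls through one interface, so whatever sits behind
// it (an OSC-driven program graph, or a runtime-compiled C++ object) can be
// swapped while the app keeps running.
#include "cinder/gl/gl.h"

#include <memory>
#include <utility>

struct ILivePatch {
    virtual ~ILivePatch() = default;
    virtual ci::gl::TextureRef render( double time ) = 0;
};

class Host {
  public:
    // Whatever produced the new patch (an OSC message, a recompiled module, ...)
    // hands it over here; the draw loop itself never changes.
    void setPatch( std::shared_ptr<ILivePatch> patch ) { mPatch = std::move( patch ); }

    void draw( double time )
    {
        if( ! mPatch )
            return;
        if( auto tex = mPatch->render( time ) )
            ci::gl::draw( tex );
    }

  private:
    std::shared_ptr<ILivePatch> mPatch;
};
```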

@ulyssesp: I guess what I would look for when it comes to “live coding” is not so much a new language or scripting system, but the ability to use a language I already know, such as C++. With that in mind, I think I would rather choose to use Simon’s block, perhaps in combination with a “live assets” system that detects when I save a file after editing it and then just reloads that file. I already have a system of my own that handles live config files, textures and shaders. The shaders in particular are really nice, because I can just tinker with the code while the app is running and directly see the results. The only downside is that I need to check the logs for errors. That would be my honest answer, but don’t let that keep you from coming up with your own preferred solution!
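
For reference, the shader half of that live-assets setup can be as simple as polling the file’s write time each frame. A rough, untested sketch — the paths are placeholders and only the fragment shader is watched here:

```cpp
// Poll the fragment shader's last-write time and recompile on change,
// keeping the previous working program if compilation fails.
#include "cinder/DataSource.h"
#include "cinder/Log.h"
#include "cinder/app/App.h"
#include "cinder/gl/gl.h"

using namespace ci;

class LiveShader {
  public:
    LiveShader( const fs::path &vertAsset, const fs::path &fragAsset )
        : mVertPath( app::getAssetPath( vertAsset ) ), mFragPath( app::getAssetPath( fragAsset ) )
    {
        mLastWrite = fs::last_write_time( mFragPath );
        reload();
    }

    // Call once per frame (or on a timer) from the app's update().
    void update()
    {
        auto writeTime = fs::last_write_time( mFragPath );
        if( writeTime != mLastWrite ) {
            mLastWrite = writeTime;
            reload();
        }
    }

    gl::GlslProgRef getProg() const { return mProg; }

  private:
    void reload()
    {
        try {
            mProg = gl::GlslProg::create( loadFile( mVertPath ), loadFile( mFragPath ) );
            CI_LOG_I( "Shader reloaded" );
        }
        catch( const gl::GlslProgExc &exc ) {
            // Keep the previous working program; surface the error in the log.
            CI_LOG_E( "Shader compile failed: " << exc.what() );
        }
    }

    fs::path mVertPath, mFragPath;
    // The return type of fs::last_write_time depends on the filesystem backend,
    // so deduce it instead of hard-coding std::time_t.
    decltype( fs::last_write_time( fs::path() ) ) mLastWrite {};
    gl::GlslProgRef mProg;
};
```

Usage is then just constructing it with the two asset paths, calling update() each frame, and binding getProg() when drawing.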

Hi.

I could use this in my Cinder Augmented Theatre App, which is OSC-controlled. It would be super cool to be able to send the shaders as OSC messages to the iOS devices (there’s a rough sketch of that after the list below).

  1. Having an example of how to do that in the case of oschader would flatten the learning curve, but then I would switch to adapting my own existing OSC controller.

  2. I have no strong opinion about this.

  3. It would be useful as a Cinder block.

  4. iOS and Android targets would be useful too.
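
Here’s roughly what I mean by sending the shader over OSC — an untested sketch assuming Cinder’s OSC block, with placeholder addresses, ports, and asset names. One caveat: a full fragment shader may not fit comfortably in a single UDP datagram, so chunking (or TCP) might be needed in practice.

```cpp
// Read a fragment shader's source into a string and ship it as one OSC
// argument; the receiving device would recompile its GlslProg from it.
#include "cinder/Utilities.h"
#include "cinder/app/App.h"
#include "cinder/osc/Osc.h"

using namespace ci;

void sendFragShader( osc::SenderUdp &sender, const fs::path &fragAsset )
{
    std::string source = loadString( app::loadAsset( fragAsset ) );

    osc::Message msg( "/shader/frag" );
    msg.append( source );
    sender.send( msg );
}

// Setup (e.g. in an app's setup()):
//   osc::SenderUdp sender( 10000 /*local*/, "192.168.1.50" /*iPad*/, 9000 /*remote*/ );
//   sender.bind();
//   sendFragShader( sender, "live.frag" );
```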

–8