"boost/Signals2.hpp" file not found in Block


I've just pulled the latest from the master branch and rebuilt (still with ./fullbuild.sh on OS X, as I haven't got to grips with CMake yet).

Most existing projects build fine, but some that use specific Cinder blocks fall over when including "boost/Signals2.hpp". Are there some breaking changes in the latest android_linux -> master merge with regard to blocks and Boost? It would be great to know how to update old blocks (if that's what is required), as some really useful ones are no longer under active development. So far this has failed with the MIDI2 block and PoScene.

Could it be something to do with using the now deprecated fullbuild.sh script instead of CMake to build Cinder? It feels like I'm missing something trivial. It also feels like I've had a similar issue before but couldn't find a solution in the old forums.

Any help appreciated.


Ok, it seems I missed the fairly basic detail that Cinder has its own version of Signals to remove this Boost dependency (which seems to have been stripped out fully now).

It seems I could replace the boost::signals2 requirement in this block with ci::signals and 'hope' that thread safety isn't an issue (in the MIDI2 block, for example, this might be questionable). Or re-include a system version of Boost instead to keep the block alive?

Again, any further input and advice is welcome, as dependency management isn't my strong suit.



A quick look at Cinder's source code tells me that the Signals classes are not inherently thread-safe. Be careful when connecting signals to slots on different threads. But remember, you can always call ci::app::App::dispatchAsync(...) from a background thread to run code on the main thread. See also this post on the old forums.



I use and maintain the MIDI2 block nowadays (it was formerly maintained by Martin Blasko); I can try to help or accept a PR!


Nice Bruce!

Great to see some other Cinder MIDI users. Will check it out and see how it plays with everything else.



A note on using ci::signals in a multi-threaded situation. To me, signals are a great solution in UI code, but in multi-threaded situations, especially real-time-sensitive ones such as audio and MIDI, they have many shortcomings. For one, you need to be clear about which thread you're on in a callback, especially if it isn't the UI thread, since most graphics calls will cause undefined behavior (a crash). Secondly, calling out from secondary real-time threads makes it difficult to ensure that everything gets processed fast enough to avoid glitches (in audio this would be an overrun; with MIDI you would just be out of sync).

But I think there are generally two cases you'd want to cover: callbacks occurring on the main thread that are meant to affect visuals or UI, and callbacks occurring on a real-time thread that you want to use to trigger something (most commonly audio). In the former case, using ci::signals will work just fine, as long as you emit them on the main thread (which can be done with something like app::App::dispatchAsync() or some sort of message queue). Here, you just want to schedule the callback as quickly as possible and return to your real-time business.

The latter case is more intricate, and there usually isn't a nice generic solution for it. @notlion can probably give some nice tips from experience here, and I can't say I've ever personally needed to do this with MIDI, but I imagine you need a mechanism more low-level than signals to handle what should happen when a MIDI event fires on a real-time thread and you want to schedule something else to occur in real time (like starting playback on an audio::GenNode or something).

To be clear about the current ci::signals limitations: emitting a signal from a background thread is fraught with peril. If a connection or disconnection happens from a different thread (e.g. the main one) while emit() is firing, you're in trouble. This is especially difficult to avoid if you're (wisely) using ci::signals::ScopedConnection or ConnectionList, since those disconnect when they get destroyed, which usually happens on the main thread. I've thought a bit about whether we could support this in the future, and we probably could (either with a separate SignalMultithreaded or some sort of deferred connection/disconnection technique), but so far the use cases haven't justified the time needed to properly implement and test it. People seem to want to do it, though. :slight_smile:

If folks would like to explore more on how to design real-time callbacks for midi -> audio, I’d be happy to join in those discussions.

Also to note, fullbuild.sh isn't deprecated, nor are the Xcode projects shipped with Cinder. For the time being, CMake is an alternative to using the Xcode projects on OS X.


Thanks so much, Rich, for this detailed response!

I had come to very similar conclusions over the weekend regarding signals and the different thread situations. I've updated the Cinder-MIDI2 block to have two signals you can optionally connect to.

One gets called via dispatchAsync, so you don't have to worry about race conditions when editing data in the main app. As you say, this is better for things like visualizing MIDI, where milliseconds of latency don't matter as much.

Then we have another signal that can be connected directly on the MIDI thread, but as you say, this is more complicated / fraught with danger due to potential race conditions… I've kept it there, though, as this was the original way the block dealt with the callbacks (albeit with boost::signals2)… I guess the Boost implementation was safer, as boost::signals2 is safe against the connection/disconnection thread race?

As far as I know, libraries like JUCE handle MIDI and audio on the same thread to do MIDI synthesis etc. (though I might be wrong).

Thanks again for the info


I guess the Boost implementation was safer, as boost::signals2 is safe against the connection/disconnection thread race?

The Boost implementation, AFAIK, uses a mutex to guard all operations. I'm not sure we want to do that, at least not in the base ci::signals::Signal class.

As far as I know libraries like JUCE handle midi and audio on the same thread to do midi synthesis etc

This brings to mind a different possibility that seems worth exploring: what about some sort of audio::MidiSchedulerNode? Basically, you would queue MIDI events to it from your MIDI thread, and the MidiSchedulerNode would drain the events on the audio thread, sending out commands to whatever audio nodes need processing. That way the MIDI events get handled as fast as possible on both the MIDI and audio threads, and neither gets blocked.

I'm not sure how the MIDI stuff is implemented under the hood, but I could also see it being possible to use this MidiSchedulerNode to do the thread handling. You'd just need to make sure there are no blocking calls going on, so you don't stall the rest of the audio graph.