cinder::audio and playing video at the same time (iOS)?

I am trying to play a movie and do an audio analysis at the same time, using cinder::audio.

The movie contains sound.

My naive approach of instantiating an audio graph using the default input device

    auto ctx = audio::master();
    mInputDeviceNode = ctx->createInputDeviceNode();
    cout << "Using " << mInputDeviceNode->getDevice()->getName() << " audio input" << endl;
    auto monitorFormat = audio::MonitorSpectralNode::Format().fftSize( 1024 ).windowSize( 512 );
    mMonitorSpectralNode = ctx->makeNode( new audio::MonitorSpectralNode( monitorFormat ) );
    mInputDeviceNode >> mMonitorSpectralNode;

resulted in

    Assertion failed: (status == noErr), function process, file /Users/eight/repos/cinder_0.9.0_mac/src/cinder/audio/cocoa/ContextAudioUnit.cpp, line 281.

Ideally though I would like to have the movie audio serve as my audio graph input. Is this possible?

If not, is it possible to set up the default audio input device so it can coexist with a playing movie?


Not sure, haven’t tried it myself. Perhaps search Stack Overflow or something like that about using both AVFoundation and AVAudioSession / Audio Units at the same time. It might be that you need a different session category, but that’s just a wild guess. Looking up what the status error from that failed assertion means might also give you an idea.


This suggests one could “tap” into AVPlayer (which I see is the player underneath Cinder’s MovieGl) in order to get at the audio buffers. Then, I think, a “tapping” audio buffer could be written to serve as the link between AVPlayer’s audio and the cinder::audio realm.
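For what it’s worth, the usual shape of that link is a single-producer/single-consumer ring buffer: the AVPlayer tap callback writes samples on its own thread, and the cinder::audio process thread reads them. A minimal portable sketch of such a buffer (the class name and sizes are illustrative, not Cinder or AVFoundation API):

```cpp
#include <algorithm>
#include <atomic>
#include <cstddef>
#include <vector>

// Single-producer / single-consumer lock-free ring buffer for float samples.
// The AVPlayer tap callback would call write() on its thread; an
// audio::Node's process() would call read() on the audio thread.
class SampleRingBuffer {
  public:
    explicit SampleRingBuffer( size_t capacity )
        : mData( capacity + 1 ), mHead( 0 ), mTail( 0 ) {}

    // Returns the number of samples actually written (may be < count if full).
    size_t write( const float *src, size_t count )
    {
        size_t head = mHead.load( std::memory_order_relaxed );
        size_t tail = mTail.load( std::memory_order_acquire );
        size_t freeSpace = ( tail + mData.size() - head - 1 ) % mData.size();
        size_t n = std::min( count, freeSpace );
        for( size_t i = 0; i < n; ++i )
            mData[( head + i ) % mData.size()] = src[i];
        mHead.store( ( head + n ) % mData.size(), std::memory_order_release );
        return n;
    }

    // Returns the number of samples actually read (may be < count if empty).
    size_t read( float *dst, size_t count )
    {
        size_t tail = mTail.load( std::memory_order_relaxed );
        size_t head = mHead.load( std::memory_order_acquire );
        size_t avail = ( head + mData.size() - tail ) % mData.size();
        size_t n = std::min( count, avail );
        for( size_t i = 0; i < n; ++i )
            dst[i] = mData[( tail + i ) % mData.size()];
        mTail.store( ( tail + n ) % mData.size(), std::memory_order_release );
        return n;
    }

  private:
    std::vector<float>  mData;
    std::atomic<size_t> mHead, mTail; // head = write index, tail = read index
};
```

The point of the lock-free design is that neither the tap callback nor the audio render callback may block; both just drop or zero-fill when the buffer is full or empty.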


Yea, writing an audio::Node that could read samples from a video file via AVFoundation would be rad, and could potentially sidestep the issue of both AVFoundation and cinder::audio trying to play sound at the same time. If you want to give it a shot, you can take a look at the NodeSubclassing sample as a starting point.
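The subclass itself would mostly be plumbing: override process() and copy whatever the tap has produced into the output buffer, zero-filling on underrun. Here is a sketch of that shape using simplified stand-in types, since the real `audio::Node` and `audio::Buffer` live in cinder/audio/ and aren’t reproduced here; `MovieInputNode` and `pushTapSamples` are hypothetical names:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Stand-ins for Cinder's audio::Buffer and audio::Node, just enough to show
// the shape of the subclass; the real classes are in cinder/audio/.
struct Buffer {
    explicit Buffer( size_t numFrames )
        : mFrames( numFrames ), mData( numFrames, 0.0f ) {}
    size_t getNumFrames() const { return mFrames; }
    float *getChannel( size_t /*ch*/ ) { return mData.data(); }
    size_t             mFrames;
    std::vector<float> mData;
};

struct Node {
    virtual ~Node() = default;
    virtual void process( Buffer *buffer ) = 0;
};

// Hypothetical Node that plays samples deposited by an AVPlayer audio tap.
// In real code mTapSamples would be a lock-free ring buffer filled from the
// tap's process callback; a plain vector keeps the sketch short.
class MovieInputNode : public Node {
  public:
    // Called (conceptually) from the tap thread.
    void pushTapSamples( const float *src, size_t count )
    {
        mTapSamples.insert( mTapSamples.end(), src, src + count );
    }

    // Called by the audio graph each block: copy what we have, pad with silence.
    void process( Buffer *buffer ) override
    {
        float *out  = buffer->getChannel( 0 );
        size_t want = buffer->getNumFrames();
        size_t have = std::min( want, mTapSamples.size() );
        std::copy( mTapSamples.begin(), mTapSamples.begin() + have, out );
        std::fill( out + have, out + want, 0.0f ); // underrun -> silence
        mTapSamples.erase( mTapSamples.begin(), mTapSamples.begin() + have );
    }

  private:
    std::vector<float> mTapSamples;
};
```

Once something like this produces samples, connecting it to the spectral analysis would look the same as the input-device version, e.g. `mMovieInputNode >> mMonitorSpectralNode;`.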