QuickTime timestamp

Hello,
I use Cinder-Hap2 with the QuickTime player, the most efficient video playback I know of on Windows!
(It has no problem looping a 4K HAP Q video at 60 fps.)

Problem:

I have trouble synchronizing the video with the display's refresh rate.
I sometimes get a small lag, and I think it is because the updateFrame function in the MovieBase class doesn't use a timestamp.
It's quite annoying, but I think it could be fixed with little effort!

void MovieBase::updateFrame()
{
    getObj()->lock();

    ::MoviesTask( getObj()->mMovie, 0 );
    if( (QTVisualContextRef)getObj()->mVisualContext ) {
        ::QTVisualContextTask( (QTVisualContextRef)getObj()->mVisualContext );
        if( ::QTVisualContextIsNewImageAvailable( (QTVisualContextRef)getObj()->mVisualContext, nil ) ) {
            getObj()->releaseFrame();

            CVImageBufferRef newImageRef = NULL;
            long tv = ::GetMovieTime( getObj()->mMovie, NULL );
            // the third argument is the CVTimeStamp -- currently always NULL
            OSStatus err = ::QTVisualContextCopyImageForTime( (QTVisualContextRef)getObj()->mVisualContext, kCFAllocatorDefault, NULL, &newImageRef );
            if( ( err == noErr ) && newImageRef )
                getObj()->newFrame( newImageRef );
            if( getObj()->mNewFrameCallback && newImageRef ) {
                (*getObj()->mNewFrameCallback)( tv, getObj()->mNewFrameCallbackRefcon );
            }
        }
    }
    getObj()->unlock();
}

For the moment the timestamp argument is simply passed as NULL!
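
For reference, the declaration of QTVisualContextCopyImageForTime (as I read it in the QuickTime headers, so double-check) shows where that NULL goes: the third parameter is the CVTimeStamp.

OSStatus QTVisualContextCopyImageForTime(
    QTVisualContextRef  visualContext,   // the movie's visual context
    CFAllocatorRef      allocator,       // e.g. kCFAllocatorDefault
    const CVTimeStamp  *timeStamp,       // the time we want a frame for -- this is the NULL above
    CVImageBufferRef   *newImage );      // out: the copied frame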

Explanation:

From http://www.cimgf.com/2008/09/10/core-animation-tutorial-rendering-quicktime-movies-in-a-caopengllayer/ :

QTVisualContextCopyImageForTime(qtVisualContext,
NULL,
timeStamp,
&currentFrame);

But Wait! What TimeStamp?

If you asked this question, then you are a very astute reader. In order to obtain the next image, we simply passed the CVTimeStamp parameter, timeStamp to our call to QTVisualContextCopyImageForTime. But how do we even have a timestamp? Isn’t that something we need to get from a display link? If you’re asking what is a display link at this point, take a look at the Core Video Programming Guide which states:

To simplify synchronization of video with a display’s refresh rate, Core Video provides a special timer called a display link. The display link runs as a separate high priority thread, which is not affected by interactions within your application process.

In the past, synchronizing your video frames with the display’s refresh rate was often a problem, especially if you also had audio. You could only make simple guesses for when to output a frame (by using a timer, for example), which didn’t take into account possible latency from user interactions, CPU loading, window compositing and so on. The Core Video display link can make intelligent estimates for when a frame needs to be output, based on display type and latencies.

I will provide a more complete answer to the question in the future as I am still studying it myself, however, I will mention that a display link callback is unnecessary in this context as the CAOpenGLLayer is providing this for us. The timestamp field is all we need in order to get the current frame assuming that the movie is playing back.
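
For completeness, here is roughly what the display link side looks like on OS X. This is only a sketch of the CVDisplayLink API (error checking omitted, and the startDisplayLink helper is my own name); as far as I know CVDisplayLink is not part of the QuickTime SDK on Windows, so Cinder-Hap2 cannot use it directly.

#include <CoreVideo/CoreVideo.h>
#include <QuickTime/QuickTime.h>

// The display link calls this once per refresh, on its own high-priority thread,
// and hands us the CVTimeStamp of the frame that is about to reach the screen.
static CVReturn displayLinkCallback( CVDisplayLinkRef displayLink,
                                     const CVTimeStamp *inNow,
                                     const CVTimeStamp *inOutputTime,
                                     CVOptionFlags flagsIn,
                                     CVOptionFlags *flagsOut,
                                     void *userInfo )
{
    QTVisualContextRef visualContext = (QTVisualContextRef)userInfo;

    QTVisualContextTask( visualContext );
    if( QTVisualContextIsNewImageAvailable( visualContext, inOutputTime ) ) {
        CVImageBufferRef frame = NULL;
        // inOutputTime is the display link's estimate of the next refresh, so the
        // copied frame matches what will actually be visible on screen.
        if( QTVisualContextCopyImageForTime( visualContext, kCFAllocatorDefault,
                                             inOutputTime, &frame ) == noErr && frame ) {
            // ... hand the frame to the renderer here ...
            CVBufferRelease( frame );
        }
    }
    return kCVReturnSuccess;
}

static void startDisplayLink( QTVisualContextRef visualContext )
{
    CVDisplayLinkRef link = NULL;
    CVDisplayLinkCreateWithActiveCGDisplays( &link );
    CVDisplayLinkSetOutputCallback( link, &displayLinkCallback, visualContext );
    CVDisplayLinkStart( link );
}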

Solution:

The idea would be to compute the timestamp intelligently from the display refresh rate and the video frame rate, in order to minimise the lag. Something like the sketch below is what I have in mind.
Any ideas?
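
Here is a rough sketch, assuming CVGetCurrentHostTime() and CVGetHostClockFrequency() from CVHostTime.h are available in the QuickTime SDK on Windows (the helper name timeStampForNextRefresh and the hard-coded 60 Hz are just for illustration, and the include paths may differ between SDKs):

#include <CVBase.h>        // CVTimeStamp, kCVTimeStampHostTimeValid, ...
#include <CVHostTime.h>    // CVGetCurrentHostTime(), CVGetHostClockFrequency()

// Build a CVTimeStamp that points at the *next* display refresh, so that
// QTVisualContextCopyImageForTime() hands back the frame that will actually
// be on screen at that moment, instead of the frame for "right now".
static CVTimeStamp timeStampForNextRefresh( double displayRefreshRate )
{
    CVTimeStamp ts = { 0 };
    uint64_t hostTicksPerRefresh = (uint64_t)( CVGetHostClockFrequency() / displayRefreshRate );

    ts.version    = 0;
    ts.rateScalar = 1.0;                                           // normal playback speed
    ts.hostTime   = CVGetCurrentHostTime() + hostTicksPerRefresh;  // one refresh ahead
    ts.flags      = kCVTimeStampHostTimeValid | kCVTimeStampRateScalarValid;
    return ts;
}

// In MovieBase::updateFrame() the NULL would then become something like:
//   CVTimeStamp ts = timeStampForNextRefresh( 60.0 );   // 60 Hz display, for example
//   OSStatus err = ::QTVisualContextCopyImageForTime( (QTVisualContextRef)getObj()->mVisualContext,
//                                                     kCFAllocatorDefault, &ts, &newImageRef );

As I understand it the video frame rate itself does not have to go into the timestamp: the visual context already knows the movie's own timing from MoviesTask(), so giving it the host time of the next refresh should be enough for it to pick the right frame. I am not sure which fields of CVTimeStamp the visual context actually looks at, though, hence the question.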