Window issue on OSX Mojave

Hi

I’m very new to Cinder. Trying some simple examples, I’ve noticed that calling gl::drawSolidCircle(getWindowCenter(), 20.0);
places the circle in the top-right corner of the window instead of in the center.

Has anyone else noticed this behavior?

Can anyone suggest a workaround?

Hi Barry,

Could you try running the same code on pre-Mojave OSX and see if it’s a Mojave-only issue? Also, is your code using an Fbo? I did notice some retina-scale-related problems with glBlitFramebuffer in the first couple of beta releases of Mojave, but that only happened in my other apps and I haven’t been able to reproduce it in Cinder.

Phil

Hi Phil

I don’t have access to a pre-Mojave machine, I’m afraid. I’m not using any FBOs; the code I posted is my entire draw call. Should I open an issue on the repo?

Did you resize the window while running the application?

If so, you will have to adjust the projection matrix to the new window dimensions. The easiest way to do this is to add this call to your draw function:

void MyApp::draw()
{
    gl::clear();

    gl::setMatricesWindow( getWindowSize() ); // <-- this one

    gl::drawSolidCircle( getWindowCenter(), 20.0f );
}

A slightly better way is to only do this when the window resizes. Override the void resize() function with this implementation:

void MyApp::resize()
{
    gl::setMatricesWindow( getWindowSize() );
}

If that’s not it, the issue might also be related to content scaling. Perhaps you’re running this on a high-density (retina) display. Try this:

gl::drawSolidCircle( toPixels( getWindowCenter() ), toPixels( 20.0f ) );

(note: I did not try this code myself, so it might need some tweaks)

I hadn’t been resizing the window, and

gl::drawSolidCircle( toPixels( getWindowCenter() ), toPixels( 20.0f ) );

didn’t work either.

Could it be anything to do with the fact that I’m using an Intel Iris Graphics 6100 1536 MB?

Edit: Spotted this issue on the oF forums. There appear to be issues with oF on Mojave too.

If you send your full code, someone can try it on a pre-Mojave machine, maybe.

That would be great.

#include "cinder/app/App.h"
#include "cinder/app/RendererGl.h"
#include "cinder/gl/gl.h"

using namespace ci;
using namespace ci::app;
using namespace std;

class CinderProjectApp : public App {
public:
    void setup() override;
    void mouseDown( MouseEvent event ) override;
    void update() override;
    void draw() override;
};

void CinderProjectApp::setup()
{
    cout << "height: " << getWindowHeight() << endl;
    cout << "width: " << getWindowWidth() << endl;
    cout << "center: " << getWindowCenter() << endl;
}

void CinderProjectApp::mouseDown( MouseEvent event )
{
}

void CinderProjectApp::update()
{
}

void CinderProjectApp::draw()
{
    gl::clear( Color( 0, 0, 0 ) );
    gl::setMatricesWindow( getWindowSize() );
    gl::drawSolidCircle( getWindowCenter(), 20.0 );
    // gl::drawSolidCircle( toPixels( getWindowCenter() ), toPixels( 20.0f ) );
}

CINDER_APP( CinderProjectApp, RendererGl )

Like I said, it’s a very basic example.

I just gave it a shot on both Mojave (MacBook Air with Intel HD 5000) and High Sierra (MacBook Pro 2018) and they both work fine. But it might be that retina is enabled by default on Mojave? My MacBook Air is non-retina so I can’t test that. Can you try explicitly disabling highDensity in prepareSettings?

I tried it on Sierra and it works fine, also with high density enabled at startup.

Thanks for your responses guys.

Stupid question: How do I enable / disable highDensity?

Extend the CINDER_APP macro at the end of the source:

CINDER_APP( CinderProjectApp, RendererGl,
        []( CinderProjectApp::Settings *settings )
        {
            settings->setHighDensityDisplayEnabled( false );
        } )

Thanks gabor! Setting setHighDensityDisplayEnabled to true did the trick!
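For anyone landing on this later, what ended up working for me is just gabor’s snippet with the flag flipped to true (only tested on my own machine, so treat it as a sketch):

CINDER_APP( CinderProjectApp, RendererGl,
        []( CinderProjectApp::Settings *settings )
        {
            // enable the retina-sized framebuffer so window coordinates line up again
            settings->setHighDensityDisplayEnabled( true );
        } )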

Great. So it seems that the default is high density on Mojave as @seph thought, but Cinder does not detect it.
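If anyone wants to double-check what Cinder thinks the scale is, something like the following might help (untested, and assuming getWindowContentScale() is available as an app-level convenience); it prints the detected content scale next to the point and pixel coordinates:

void CinderProjectApp::setup()
{
    // untested diagnostic: on a correctly detected retina surface the content
    // scale should read 2 and the pixel center should be double the point center
    console() << "content scale: " << getWindowContentScale() << endl;
    console() << "center (points): " << getWindowCenter() << endl;
    console() << "center (pixels): " << toPixels( getWindowCenter() ) << endl;
}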

Just for people potentially running into this in the future: I didn’t have this issue on Mojave until upgrading to Xcode 10.

Edit: This has sent me down the rabbit hole. Apple have really screwed the pooch on this one (going by the sheer volume of broken projects I’ve found, though I suppose we were warned about the GL deprecation).

Anyway, it seems there’s no way to disable retina-sized framebuffers with the current setWantsBestResolutionOpenGLSurface dance, which results in coordinate systems being all over the place. If you can afford to render at high DPI, then enabling high density will work fine, but the default 1x mode is basically no longer “officially” supported in the context of Cinder without mucking around with projection matrices and whatnot (that only fixes the coordinate space; the results are unacceptably blurry). I’m in the process of downgrading Xcode to 9.4.1, which by all reports fixes the issue.
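For completeness, the projection-matrix mucking I mean looks roughly like this (an untested sketch, and again it only fixes the coordinate space, not the blurriness):

void MyApp::draw()
{
    gl::clear();

    // untested sketch: size the window matrices to the framebuffer (pixels)
    // and draw in pixel coordinates so points and pixels line up again
    gl::setMatricesWindow( toPixels( getWindowSize() ) );
    gl::drawSolidCircle( toPixels( getWindowCenter() ), toPixels( 20.0f ) );
}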

If you’re in the middle of a project that relies on the 1x scale of OpenGL surfaces, don’t upgrade. Better yet, throw away your mac. This company is dogshit. /rant

I feel sad for all Apple developers and Mac users. Such a pleasant OS to work with, but the tools and support are now lightyears behind most other platforms.

This seems to be addressed with the Xcode 10.1 release.