Window resolution with highDensityDisplay

Hello 🙂

I’m working on a MacBook Pro with a Retina display (15-inch, 2017). According to “About This Mac”, the display has a resolution of 2880 x 1800.

My Cinder app explicitly sets its main window to 1000 x 1000, but whether or not I call setHighDensityDisplayEnabled() the window takes up most of my screen, and when I save a screenshot using copyWindowSurface() the result is always 1000 x 1000, whereas I was expecting double that, or at least something higher. I read this article to try to get more clarity, but I remain a little confused:
https://forum.libcinder.org/topic/rfc-retina-high-density-display-support
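
For reference, here’s a minimal sketch of roughly what I’m doing (the class name, key binding and screenshot path are just placeholders, not my actual code):

```cpp
#include "cinder/app/App.h"
#include "cinder/app/RendererGl.h"
#include "cinder/ImageIo.h"
#include "cinder/Utilities.h"

using namespace ci;
using namespace ci::app;

class RetinaTestApp : public App {
  public:
    void keyDown( KeyEvent event ) override {
        if( event.getChar() == 's' ) {
            // Save whatever copyWindowSurface() returns so we can inspect its dimensions.
            writeImage( getHomeDirectory() / "screenshot.png", copyWindowSurface() );
        }
    }
};

CINDER_APP( RetinaTestApp, RendererGl, []( App::Settings *settings ) {
    settings->setWindowSize( 1000, 1000 );
    settings->setHighDensityDisplayEnabled();
} )
```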

Am I missing something?

Thanks a lot for your help!
Johan

Have a squizz at this


I think when you set the window size you are dealing with points instead of pixels.

So if you call settings->setWindowSize( 500, 500 ) with high density enabled, then toPixels( getWindowWidth() ) will return 1000.
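
Something like this minimal sketch (names are illustrative; the commented values assume a 2x Retina display):

```cpp
#include "cinder/app/App.h"
#include "cinder/app/RendererGl.h"

using namespace ci;
using namespace ci::app;

class PointsVsPixelsApp : public App {
  public:
    void setup() override {
        // getWindowWidth() is in points; toPixels() multiplies by the content scale.
        console() << "width (points): " << getWindowWidth() << std::endl;                       // 500
        console() << "width (pixels): " << getWindow()->toPixels( getWindowWidth() ) << std::endl; // 1000
        console() << "content scale:  " << getWindow()->getContentScale() << std::endl;         // 2
    }
};

CINDER_APP( PointsVsPixelsApp, RendererGl, []( App::Settings *settings ) {
    settings->setWindowSize( 500, 500 );      // interpreted as points, not pixels
    settings->setHighDensityDisplayEnabled(); // request the full native backing resolution
} )
```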


Thanks for your answers, which helped me understand different parts of the problem.

It turns out that even though “About This Mac” reports the display’s native resolution as 2880 x 1800, macOS actually renders the desktop at a lower, scaled resolution to keep things legible and usable. The scaling can be adjusted in System Preferences > Displays.
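
For anyone curious, here’s a quick way to log what the app sees for the main display (the commented values are an assumption on my part and depend on the scaling you picked in System Preferences):

```cpp
#include "cinder/Display.h"
#include "cinder/app/App.h"

// Log the main display's reported size and content scale.
// With "looks like 1440 x 900" scaling on a 2x Retina display, this would
// print 1440 x 900 (points) with a content scale of 2, not 2880 x 1800.
void logMainDisplay()
{
    auto display = ci::Display::getMainDisplay();
    ci::app::console() << "display: " << display->getWidth() << " x " << display->getHeight()
                       << ", content scale " << display->getContentScale() << std::endl;
}
```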

New follow-up. @seph I had tried what you suggested before when building the app with CMake in VSCode, and it didn’t work. However, trying it now in Xcode, it all works as expected. With further logging I realized that even when I specify settings->setHighDensityDisplayEnabled(), getContentScale() still returns 1 with CMake, whereas it returns 2 when building and launching from Xcode.

Any idea where this could come from? Maybe it’s something in my CMake build script?
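
For what it’s worth, one thing I’m going to check (just a guess on my part, not confirmed): macOS only gives an app a Retina (2x) backing surface when it runs as a .app bundle whose Info.plist declares NSHighResolutionCapable. Xcode’s templates set this up automatically, while a bare executable target built with CMake may not, which would explain getContentScale() returning 1. A sketch of what the CMake side might look like, where MyApp and the Info.plist path are placeholders:

```cmake
# Build the target as a macOS bundle and point CMake at an Info.plist that
# contains: <key>NSHighResolutionCapable</key> <true/>
set_target_properties( MyApp PROPERTIES
    MACOSX_BUNDLE TRUE
    MACOSX_BUNDLE_INFO_PLIST "${CMAKE_CURRENT_SOURCE_DIR}/Info.plist"
)
```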

Continuing this discussion about high-density display here for those interested: