Image saved with copyWindowSurface() less vibrant


#1

Hello,

I regularly take screenshots of my app using copyWindowSurface() and have noticed lately that the screenshots look a little less saturated and vibrant than what I see in the app window. The docs say copyWindowSurface() returns a Surface8u, so maybe I need more range and should use a Surface16u or Surface32f.

Is there an equivalent function to copyWindowSurface() but with a higher dynamic range? Or maybe my analysis is wrong and the issue comes from somewhere else?

Thanks!
Johan


#2

I don’t believe it has anything to do with the number of bits per channel, because in the end your monitor typically only shows 8 bits per channel (or even fewer on cheaper LCD panels).

What I think might play a role is gamma correction. But to know for certain, you should probably measure the colors of your running application and compare them with the image. There are tools that read color values straight from the monitor; I often use ColorPix, and I am certain there are similar tools for Mac OS X as well. Could you measure a few RGB values in both your app and the screenshot and post them here?

You may also want to experiment with saving an image as PNG or JPG and see if that makes a difference.

-Paul


#3

Hi Paul, thanks so much for your help once again :slight_smile:

Here is a comparison of three RGB values between the app and the screenshot:

  • Reds: app 255, 18, 42 vs screenshot 209, 49, 49
  • Blues: app 34, 167, 255 vs screenshot 81, 169, 249
  • Cyans: app 0, 246, 255 vs screenshot 117, 250, 253

I tried saving as JPG instead of PNG, and both formats show the problem.

If it is indeed gamma correction, what would be a good way to make sure that what I see on screen in the app is what gets saved? If I apply an inverse gamma correction in my shader, wouldn’t there still be a discrepancy?

Thank you!


#4

screenshot (saved using copyWindowSurface):

app window (saved using the MacOS screenshot tool directly on the app window):


#5

Are those measurements of the exact same pixel? Because if so, they don’t make any sense. I see a green value go from 18 to 49 and a blue value from 42 to 49, which would imply different conversion parameters per color channel. Also, if it were a gamma-related problem, the value 255 would still be 255 after conversion. The same goes for 0, which seems to go to 117(!) in your measurements.

Could you perhaps share the code that creates the screenshot? Because this is very fishy.

-Paul

PS: normally, you’d only need gamma conversion when reading a JPG texture (they tend to be stored in sRGB), which would be done like this:

vec3 linearRGB = pow( texture( uTex, vTexCoord ).rgb, vec3( 2.2 ) );

and then again as the very last step of your shader, just before outputting to the main buffer:

fragColor.rgb = pow( linearRGB, vec3( 1.0 / 2.2 ) );

(In this sample I use a gamma value of 2.2. You could also simplify by using 2, in which case you can multiply the color with itself to linearize it and take the square root to convert it back to non-linear color.)


#6

Yeah I read the link you sent about gamma correction and it indeed doesn’t make sense in my case.

Here is the code creating the screenshot:

const char *homeDir = getenv( "HOME" ); // note: may return nullptr if HOME is unset
auto path = string( homeDir ) + string( "/Desktop/screenshot_" ) + to_string( getElapsedSeconds() );
writeImage( path + string( ".png" ), copyWindowSurface() );

#7

To make sure I was sampling the same pixel, I simplified my shader so it only outputs one color in screen space:

vec3 color = vec3(0.);
color = vec3(1.000, 0.024, 0.122);
oColor = vec4(color, 1.);

In the app window the pixel color is rgb(255, 6, 31), whereas the screenshot saved using the code above has rgb(234, 52, 48).

Even simpler: when I set the color to (1., 0., 0.), the saved screenshot shows an RGB value of (234, 50, 35), whereas the app window properly shows (255, 0, 0).


#8

For what it’s worth, instead of using copyWindowSurface(), I tried to write my final output into a texture and call writeImage() from that:

writeImage( path + string( ".png" ), mMultipassShader.mMainFbo->getColorTexture()->createSource() );

Unfortunately, same problem :confused: Not sure if it’s relevant, but I’m on a MacBook Pro laptop with high density enabled. Thanks!


#9

I have been searching Cinder’s code for clues, but nothing stands out. At first I thought it also might be related to “limited color range”, where a projector only uses values between 16 and 235. But then I looked at your values again and they still don’t make any sense, going from 0 to 50 for green and 0 to 35 for blue.

So now I am thinking: are you using one of those godawful applications that change the colors of your monitor over the course of the day? It’s the only thing left on my list of hypotheses. That, or your computer is just messing with you.

Maybe you should step through the code in the debugger. Make sure you can also step through Cinder’s source code. Not sure how to do this on Mac OSX, probably by simply adding Cinder’s project to your project. Try to locate the exact moment when a color goes from making sense to Alice-in-Wonderland mode.

-Paul


#10

Haha thanks for your help Paul, I’ll keep poking at it and update the thread when I figure it out.

I do use f.lux, but I turned it off for all of these tests. Maybe it was still somehow having an impact, though; I’ll investigate that as well.


#11

Some random provocations: What’s your alpha channel set to? What are your glBlend settings? Does the problem persist with colors other than red? Can you output a gray gradient and compare it visually to a gamma corrected gray gradient? Does the problem persist if you read in a PNG and output it?

It could also be something to do with the color profile of the saved image. Can you open it in Photoshop and play with that?


#12

Finally found the issue with this, thanks so much to @heisters and @paul.houx for the precious help :slight_smile:

It turns out my Mac laptop was using the “Color LCD” display color profile, and not sRGB (System Preferences > Displays > Color). So this whole time the image data was being properly written by my Cinder app, but apps such as Preview or Photoshop were using that color profile to display it, whereas the Cinder app was bypassing it.