Texture2d from Data without clamping

Hi,

I am trying to create a texture from data as shown below:

gl::Texture2d::Format format;
gl::Texture2dRef texDepthSpace = gl::Texture2d::create( &dsPoints[ 0 ], GL_RG, kColorWidth, kColorHeight, format.dataType( GL_FLOAT ) );

Here, dsPoints is a vector of { float X, float Y } pairs. In my shader, the sampled values appear to be clamped at 1.0. How can I prevent the clamping?
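
For context, dsPoints is filled roughly like this (a simplified sketch; DsPoint and the compute functions are just placeholders, and the real values can be well outside the 0..1 range):

struct DsPoint { float x; float y; };                          // two tightly packed floats per texel
std::vector<DsPoint> dsPoints( kColorWidth * kColorHeight );   // one entry per texel
for( size_t i = 0; i < dsPoints.size(); ++i ) {
    dsPoints[ i ].x = computeDepthSpaceX( i );                 // placeholder; values can exceed 1.0
    dsPoints[ i ].y = computeDepthSpaceY( i );
}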

Thank you,
Bala.

GL_RG is an 8-bit-per-channel format. The values are treated as normalized “floats”, so a value of 255 becomes 1.0 and 127 is roughly 0.5. If you want the values treated as unsigned chars, use GL_RG8UI; for signed chars, GL_RG8I. For 32-bit floats, use GL_RG32F, but be advised that using more than 32 bits per texel (64 bits in this case) may cause some performance degradation.

See also:
https://www.opengl.org/sdk/docs/man/docbook4/xhtml/glTexImage2D.xml
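
At the raw GL level the difference looks roughly like this (a sketch using plain glTexImage2D calls rather than Cinder's wrapper; width, height, bytePtr and floatPtr stand in for your own data):

// 8-bit normalized: each channel is stored as an unsigned byte and sampled as a float in [0, 1]
glTexImage2D( GL_TEXTURE_2D, 0, GL_RG8, width, height, 0, GL_RG, GL_UNSIGNED_BYTE, bytePtr );

// 32-bit float: the floats are stored as-is, so sampled values are not clamped to [0, 1]
glTexImage2D( GL_TEXTURE_2D, 0, GL_RG32F, width, height, 0, GL_RG, GL_FLOAT, floatPtr );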

I tried using GL_RG32F in place of GL_RG, but then every texture value I read back is zero. In my fragment shader:

uniform sampler2D uDsMap;
in vec2 TexCoord;
out vec4 oColor;

void main( void )
{
    vec2 dsc = texture( uDsMap, vec2( TexCoord.x, 1.0 - TexCoord.y ) ).rg;
    if( dsc.r == 0.0 )
        oColor = vec4( 1.0, 0.0, 0.0, 1.0 );
}

I get a fully red image, but if I use GL_RG, I can read the values. Looking at the glTexImage2D call:

It passes “mInternalFormat” ( GL_RGBA ) where GL_RG32F should go (according to my understanding of the API docs), and GL_RG32F is not one of the acceptable symbolic values for dataFormat. I assume I am overlooking something simple.
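
For reference, this is the signature I'm going by:

void glTexImage2D( GLenum target, GLint level,
                   GLint internalFormat,   // sized formats like GL_RG32F go here
                   GLsizei width, GLsizei height, GLint border,
                   GLenum format,          // dataFormat: GL_RG, GL_RGBA, ...
                   GLenum type,            // dataType: GL_FLOAT, GL_UNSIGNED_BYTE, ...
                   const GLvoid *data );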

Thank you for your time!

Oops. Got it working. I needed to set the internalFormat of the texture to make it work.

gl::Texture2d::Format format;
format.setInternalFormat( GL_RG32F );
gl::Texture2dRef texDepthSpace = gl::Texture2d::create( &dsPoints[ 0 ], GL_RG, kColorWidth, kColorHeight, format.dataType( GL_FLOAT ) );
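
For anyone hitting the same issue, a quick optional sanity check (assuming the texture is a plain GL_TEXTURE_2D target) that the float internal format actually took:

GLint fmt = 0;
texDepthSpace->bind();
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt );
texDepthSpace->unbind();
// fmt should now report GL_RG32F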

Thanks!