Writing to a texture via compute shaders?


#1

Hi, I’m hoping to experiment with drawing to a 3D texture (or other targets!) using a compute shader.

I’m hitting a bit of a roadblock & I’m not sure where the problem is. Some info:

  • The 3D texture is valid, & the compute shader compiles
  • I’m on Windows, so OpenGL 4.2+ (needed for imageStore) is available
  • I’ve enabled OpenGL 4.3 in my app

Unfortunately, I can’t seem to figure out how to send my texture to the shader as a uniform for use with imageStore, i.e. as an image3D or image2D uniform type. I’m always hitting the following:

error: cinder::gl::typeToBytes[605] Unknown gl type constant 0x904d
error: cinder::gl::GlslProg::CheckUniformType[1450] unknown uniform type
Warning: cinder::gl::GlslProg::logUniformWrongType[971] Uniform type mismatch for "img_output", expected 0x904d and received (uint32_t)

& on draw:
error: cinder::gl::GlslProg::CheckUniformType[1450] unknown uniform type etc

My basic compute shader section looks like this:


gl::ScopedGlslProg prog( mUpdateProg );
GLuint tex_id = 32;                           // texture unit to bind to
mTex3d->bind( tex_id );                       // bind the 3D texture to that unit
mUpdateProg->uniform( "destTex", tex_id );    // pass the unit index as the uniform
gl::dispatchCompute( 256, 256, 3 );
gl::memoryBarrier( GL_SHADER_STORAGE_BARRIER_BIT );

& my compute shader right now is this

#version 430
// A compute shader must declare a local work-group size in at least one
// compute stage, otherwise linking fails; 1x1x1 matches the dispatch above.
layout( local_size_x = 1, local_size_y = 1, local_size_z = 1 ) in;
layout( r8 ) writeonly uniform image3D destTex;

void main()
{
    ivec3 storePos = ivec3( gl_GlobalInvocationID.xyz );
    imageStore( destTex, storePos, vec4( 3.0 / 255.0 ) );
}

I’ve tried the same with a 2D image too (uniform image2D destTex) and get the same errors.

If anyone has any advice or pointers it’d be greatly appreciated!


#2

Ok, so I did some digging and found that the errors & warnings were being thrown because these types (GL_IMAGE_2D, GL_IMAGE_3D) aren’t handled in some of Cinder’s GL functions. I’ve added them to the following files & functions:

gl/ConstantConversions.cpp

std::string	constantToString( GLenum constant )
...
#if ! defined( CINDER_GL_ES_2 )
 sSymbols[GL_IMAGE_2D] = "GL_IMAGE_2D";
 sSymbols[GL_IMAGE_3D] = "GL_IMAGE_3D";
...
 uint8_t typeToBytes( GLenum type )
...
     #if ! defined( CINDER_GL_ES_2 )
     case GL_IMAGE_2D:					return sizeof(int);
     case GL_IMAGE_3D:					return sizeof(int);
...

gl/GlslProg.cpp

template<typename T>
bool GlslProg::checkUniformType( GLenum uniformType ) const
...
 #if ! defined( CINDER_GL_ES_2 )
 case GL_IMAGE_2D:					return std::is_same<T, int32_t>::value;
 case GL_IMAGE_3D:					return std::is_same<T, int32_t>::value;
...

This clears up all of my warnings/errors, but nothing happens to my texture, so I suspect that’s my actual problem now & I’ll continue digging.
I plan to go the pure-GL route first, then do it the Cinder way once I’ve got something working as it should.

If anyone can see why my uniform value wouldn’t work though, let me know. It should just be passing the texture unit to the shader, which afaik it’s now doing?
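For the pure-GL attempt, my understanding is that an imageStore target has to be bound to an image unit with glBindImageTexture, which is separate from the regular texture binding points, and the image uniform then takes that image unit as a plain int. Roughly, an untested sketch (getId() is Cinder’s texture-id accessor; the format argument has to match the shader’s r8 qualifier):

```cpp
// Bind mip level 0 of the 3D texture to image unit 0 for writing.
// layered = GL_TRUE exposes all slices of the 3D texture; the format
// argument must match the shader's layout( r8 ) qualifier.
GLuint imageUnit = 0;
glBindImageTexture( imageUnit, mTex3d->getId(), 0 /*level*/,
                    GL_TRUE /*layered*/, 0 /*layer*/,
                    GL_WRITE_ONLY, GL_R8 );
mUpdateProg->uniform( "destTex", (int)imageUnit ); // image uniforms are ints
```

If that’s right, it would also explain why a plain mTex3d->bind( tex_id ) leaves the texture untouched: the shader’s image unit never had anything bound to it.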

edit: sorry for the weird formatting on the code samples; I can’t seem to get the indentation right.