Writing to a texture via compute shaders?

OK, so I did some digging and found that the errors & warnings were being thrown because these types (GL_IMAGE_2D, GL_IMAGE_3D) aren't implemented in some of the gl functions. I've added them to the following files & functions:

gl/ConstantConversions.cpp

std::string	constantToString( GLenum constant )
...
#if ! defined( CINDER_GL_ES_2 )
 sSymbols[GL_IMAGE_2D] = "GL_IMAGE_2D";
 sSymbols[GL_IMAGE_3D] = "GL_IMAGE_3D";
...
 uint8_t typeToBytes( GLenum type )   
...
     #if ! defined( CINDER_GL_ES_2 )
     case GL_IMAGE_2D:					return sizeof(int);
     case GL_IMAGE_3D:					return sizeof(int);
...

gl/GlslProg.cpp

template<typename T>
bool GlslProg::checkUniformType( GLenum uniformType ) const
...
 #if ! defined( CINDER_GL_ES_2 )
 case GL_IMAGE_2D:					return std::is_same<T, int32_t>::value;
 case GL_IMAGE_3D:					return std::is_same<T, int32_t>::value;
...
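With the validation relaxed like this, an image2D uniform should be settable as a plain int, the same way a sampler uniform is. A minimal sketch of what I mean (names like mGlsl and "uDestTex" are placeholders, not from the actual code):

```cpp
// Hypothetical setup: mGlsl is a gl::GlslProgRef built with a compute stage,
// and the shader declares the target as:
//   layout( rgba8, binding = 0 ) uniform image2D uDestTex;
mGlsl->uniform( "uDestTex", 0 ); // 0 = the *image unit*, not a texture unit
```

Note that if the shader already gives the uniform a `binding = 0` layout qualifier, setting it from the CPU side is redundant.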

This clears up all of my warnings and errors, but nothing happens to my texture, so I suspect that's where the real problem is now and I'll keep digging.
I plan to go the pure GL route first, then do it the Cinder way once I've got something working as it should.

If anyone can see why my uniform value wouldn't work, though, let me know. It should just be passing the image unit to the shader, which, as far as I know, it now does?
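For reference, the two things that most often make image writes silently do nothing are (a) binding the texture with glActiveTexture/glBindTexture instead of glBindImageTexture, and (b) reading the result back without a glMemoryBarrier. A sketch of the pure-GL dispatch path I'm aiming for (handles and sizes are assumed to already exist; the format passed to glBindImageTexture must match the shader's layout qualifier):

```cpp
// Assumes: 'prog' is a linked program containing a compute shader whose
// image2D uniform is "uDestTex", and 'tex' is an immutable GL_RGBA8 texture
// created with glTexStorage2D. The image unit (0) is illustrative.
glUseProgram( prog );

// Image uniforms take an *image unit*; glActiveTexture plays no part here.
glBindImageTexture( 0, tex, 0, GL_FALSE, 0, GL_WRITE_ONLY, GL_RGBA8 );
glUniform1i( glGetUniformLocation( prog, "uDestTex" ), 0 );

glDispatchCompute( texWidth / 16, texHeight / 16, 1 ); // match local_size_x/y

// Make the image writes visible before sampling the texture elsewhere.
glMemoryBarrier( GL_SHADER_IMAGE_ACCESS_BARRIER_BIT );
```

If the texture is later sampled in a fragment shader, GL_TEXTURE_FETCH_BARRIER_BIT is the relevant barrier bit instead.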

edit: sorry for the weird formatting on the code samples, can’t seem to indent properly.