Hi there. I’ve come across something strange when drawing point sprites (using gl_PointCoord in a fragment shader). When I draw to an FBO versus drawing directly to the default (screen) framebuffer, the origin of the point texture coordinates seems to be flipped. I can fix this by explicitly setting the GL_POINT_SPRITE_COORD_ORIGIN point parameter, but even that behaves in a way that doesn’t make sense to me.
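(For reference, that’s this call – according to the glPointParameter docs the initial value of the parameter is GL_UPPER_LEFT:)

// controls which corner gl_PointCoord (0,0) maps to for subsequent point rendering
glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, GL_LOWER_LEFT); // or GL_UPPER_LEFT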
I have a test program which draws the same thing – first to an FBO, then to the screen. The following program works “correctly” (or at least consistently) – the sprites each have red at the top and blue at the bottom. (I don’t really care whether the sprites use a lower-left or upper-left origin; I can fix it in the shader, as long as it’s consistent when switching between FBO and direct rendering!)
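(By “fix it in the shader” I just mean a one-line vertical flip of the point coordinate, something like this – illustrative only:)

// flip the sprite’s vertical axis so either origin convention gives the same image
float y = 1.0 - gl_PointCoord.y;
oColor = mix(vec4(1,0,0,1), vec4(0,0,1,1), y);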
#include "cinder/app/App.h"
#include "cinder/app/RendererGl.h"
#include "cinder/gl/gl.h"
using namespace ci;
using namespace ci::app;
class FboSpriteFlipApp : public App {
public:
void setup() override;
void update() override;
void draw() override;
private:
void render(ivec2 const& windowSize);
gl::FboRef mFbo;
CameraPersp mCam;
gl::GlslProgRef mShader;
};
void FboSpriteFlipApp::setup()
{
    mFbo = gl::Fbo::create(320, 480);
    mCam.lookAt(vec3(1, 0.5f, 5), vec3(0, -0.5f, 0), vec3(0, 1, 0));
    glEnable(GL_PROGRAM_POINT_SIZE); // let the vertex shader set gl_PointSize

    mShader = gl::GlslProg::create(gl::GlslProg::Format()
        .vertex(CI_GLSL(330,
            uniform mat4 ciModelViewProjection;
            in vec4 ciPosition;
            void main(void) {
                gl_Position = ciModelViewProjection * ciPosition;
                gl_PointSize = 24;
            }
        ))
        .fragment(CI_GLSL(330,
            out vec4 oColor;
            void main(void) {
                // red where gl_PointCoord.y == 0, blue where it is 1
                oColor = mix(vec4(1,0,0,1), vec4(0,0,1,1), gl_PointCoord.y);
            }
        )));
}
void printSpriteCoordOrigin(char const* prefix)
{
    GLint psco = 0;
    glGetIntegerv(GL_POINT_SPRITE_COORD_ORIGIN, &psco);
    char const* name = nullptr;
    switch (psco) {
        case GL_LOWER_LEFT:
            name = "LOWER_LEFT";
            break;
        case GL_UPPER_LEFT:
            name = "UPPER_LEFT";
            break;
        default:
            name = "<unknown>";
    }
    console() << prefix << " point sprite coord origin: " << name << '\n';
}
void FboSpriteFlipApp::render(ivec2 const& windowSize)
{
    mCam.setAspectRatio(windowSize.x / static_cast<float>(windowSize.y));
    gl::ScopedViewport svp(ivec2(0), windowSize);
    gl::clear(Color::black());
    gl::setMatrices(mCam);
    gl::color(Color::white());
    // rasterize the sphere's vertices as points, so each one becomes a point sprite
    gl::ScopedPolygonMode spm(GL_POINT);
    gl::ScopedGlslProg sgp(mShader);
    gl::drawSphere(vec3(0, 0, 0), 1);
}
void FboSpriteFlipApp::update()
{
    gl::ScopedFramebuffer sfb(mFbo);
    printSpriteCoordOrigin("fbo");
    render(mFbo->getSize());
}
void FboSpriteFlipApp::draw()
{
    printSpriteCoordOrigin("window");
    glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, GL_UPPER_LEFT);
    printSpriteCoordOrigin("modified window");
    render(getWindowSize() / ivec2(2, 1));
    // Why is it necessary to switch this back here, and not just before rendering to the FBO?
    glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, GL_LOWER_LEFT);

    // draw the FBO's color texture into the right half of the window
    Area subWindowBounds(getWindowWidth() / 2, 0, getWindowWidth(), getWindowHeight());
    gl::ScopedViewport svp(subWindowBounds.getUL(), subWindowBounds.getSize());
    gl::ScopedScissor ssc(subWindowBounds.getUL(), subWindowBounds.getSize());
    gl::clear(Color::gray(0.25f));
    gl::setMatricesWindow(subWindowBounds.getSize());
    gl::ScopedColor scol(Color::white());
    auto targetRect = Rectf(mFbo->getBounds())
                          .getCenteredFit(Area(0, 0, subWindowBounds.getWidth(), subWindowBounds.getHeight()), true)
                          .scaledCentered(1.0f);
    gl::draw(mFbo->getColorTexture(), targetRect);
}
CINDER_APP( FboSpriteFlipApp, RendererGl )
While running, it prints (for each frame):
fbo point sprite coord origin: LOWER_LEFT
window point sprite coord origin: LOWER_LEFT
modified window point sprite coord origin: UPPER_LEFT
However, if I set the origin to LOWER_LEFT just before drawing the FBO, and don’t reset it immediately after drawing to the window, the printed output is identical, but the rendered output is wrong…
void FboSpriteFlipApp::update()
{
    gl::ScopedFramebuffer sfb(mFbo);
    // Set the origin to LOWER_LEFT just prior to drawing FBO
    glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, GL_LOWER_LEFT);
    printSpriteCoordOrigin("fbo");
    render(mFbo->getSize());
}
void FboSpriteFlipApp::draw()
{
    printSpriteCoordOrigin("window");
    glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, GL_UPPER_LEFT);
    printSpriteCoordOrigin("modified window");
    render(getWindowSize() / ivec2(2, 1));
    //glPointParameteri(GL_POINT_SPRITE_COORD_ORIGIN, GL_LOWER_LEFT);
... the rest is the same as before ...
Here the direct-to-window version (on the left half of the view) still has red at the top of each sprite, but the FBO version (on the right-hand side) now goes from blue at the top to red at the bottom. Yet the value of the point parameter (GL_POINT_SPRITE_COORD_ORIGIN) is the same in each tested case, so it prints the same output as before:
fbo point sprite coord origin: LOWER_LEFT
window point sprite coord origin: LOWER_LEFT
modified window point sprite coord origin: UPPER_LEFT
At first, I thought this implied that something outside my code was changing the point sprite state. But I can find no use of glPointParameter (or indeed any use of point sprites, glPointSize, etc.) in Cinder’s code. Instead, I think it’s likely related to Cinder’s matrices defaulting to originUpperLeft, whereas OpenGL viewports put the origin at the lower left… On the other hand, what I’m doing seems like the most “normal” usage of Cinder, and I wouldn’t expect a difference between FBO and regular rendering when the render code is identical (e.g. setting the camera matrix is done the same way). I guess it must be the case that the FBO is rendered upside down, and then flipped when the camera matrix is applied, to compensate for it?
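For what it’s worth, the only way I can think of to avoid toggling GL state between passes is to leave GL_POINT_SPRITE_COORD_ORIGIN alone entirely and do the flip in the fragment shader, driven by a uniform set according to the render target (uFlipY is just a name I made up, not anything from Cinder):

// fragment shader: flip gl_PointCoord.y only when uFlipY is 1.0
.fragment(CI_GLSL(330,
    uniform float uFlipY; // 1.0 when rendering into the FBO, 0.0 for the window
    out vec4 oColor;
    void main(void) {
        float y = mix(gl_PointCoord.y, 1.0 - gl_PointCoord.y, uFlipY);
        oColor = mix(vec4(1,0,0,1), vec4(0,0,1,1), y);
    }
))

// render() would take a flag and set the uniform before drawing:
void FboSpriteFlipApp::render(ivec2 const& windowSize, bool toFbo)
{
    mCam.setAspectRatio(windowSize.x / static_cast<float>(windowSize.y));
    gl::ScopedViewport svp(ivec2(0), windowSize);
    gl::clear(Color::black());
    gl::setMatrices(mCam);
    gl::color(Color::white());
    gl::ScopedPolygonMode spm(GL_POINT);
    gl::ScopedGlslProg sgp(mShader);
    mShader->uniform("uFlipY", toFbo ? 1.0f : 0.0f);
    gl::drawSphere(vec3(0, 0, 0), 1);
}

But that feels like treating the symptom rather than the cause, so I’d still like to understand what’s actually going on.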
Any ideas on how to make sense of this and do it the right way (preferably without needing to change the origin between FBO and screen drawing)?
Thanks,
Glen.