I'm rendering a scene of polygons to multiple render targets so that I can perform postprocessing effects. However, the values I write in the fragment shader don't seem to match what actually ends up in the render target.
Right now the pipeline looks like this:
- Render the basic polygons (using the simple fragment shader below) to an intermediate buffer
- Render the buffer as a screen-sized quad to the screen.
I'm using WebGL Inspector (http://benvanik.github.com/WebGL-Inspector/) to view the intermediate buffers (created using gl.createFramebuffer()).
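For reference, the intermediate buffer is set up in the usual texture-backed way, roughly like this (simplified; tex, fb, width, and height stand in for my actual objects):

var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
// RGBA color attachment that the polygons get drawn into
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
// framebuffer wrapping the texture so it can be used as a render target
var fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);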
I have a very simple fragment shader when drawing the polygons, something like this:
gl_FragColor = vec4(1, 0, 0, 0.5);
And this before my draw call:
gl.disable(gl.BLEND);
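In context, the first pass boils down to this (again simplified; polygonProgram and vertexCount are placeholders for my actual program and geometry):

gl.bindFramebuffer(gl.FRAMEBUFFER, fb);  // draw into the intermediate buffer, not the canvas
gl.viewport(0, 0, width, height);
gl.disable(gl.BLEND);                    // blending off, so the fragment color should land in the buffer unchanged
gl.clearColor(0, 0, 0, 0);
gl.clear(gl.COLOR_BUFFER_BIT);
gl.useProgram(polygonProgram);
gl.drawArrays(gl.TRIANGLES, 0, vertexCount);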
I would expect this to write a pixel into the buffer with a value of exactly (255, 0, 0, 128), but it actually writes (255, 0, 0, 64): half the alpha I expected. Since 64/255 is about 0.25 = 0.5 * 0.5, it looks as though the 0.5 alpha is being applied twice somewhere.
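To rule out the inspector's display itself, I can also read the pixel back directly with gl.readPixels while the intermediate framebuffer is still bound (x and y here are just coordinates inside one of the drawn polygons):

var px = new Uint8Array(4);
// reads [r, g, b, a] for the pixel at (x, y) from the currently bound framebuffer
gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
console.log(px);  // px[3] should be 128 if the write happens as expected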
The program is fairly large and tangled, so I'll update the post with specific details if the answer isn't immediately apparent.
Thanks!