13

I have a framebuffer with a depth attachment and 4 color attachments backed by 4 textures.

I draw some stuff into it, unbind the framebuffer, and then use the 4 textures in a fragment shader (deferred lighting). Later I want to draw some more stuff to the screen using the depth buffer from my framebuffer. Is that possible?

I tried binding the framebuffer again and specifying glDrawBuffer(GL_FRONT), but it does not work.

ShPavel

3 Answers

15

Like Nicol already said, you cannot use an FBO's depth buffer as the default framebuffer's depth buffer directly.

But you can copy the FBO's depth buffer over to the default framebuffer using the EXT_framebuffer_blit extension (which has been core since GL 3.0):

glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, width, height, 0, 0, width, height, 
                  GL_DEPTH_BUFFER_BIT, GL_NEAREST);

If this extension is not supported (which is unlikely when you already have FBOs), you can use a depth texture for the FBO's depth attachment and render it to the default framebuffer using a textured quad and a simple pass-through fragment shader that writes into gl_FragDepth. Though this might be slower than just blitting it over.
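Creating such a depth texture and attaching it to the FBO might look like this (a sketch; `fbo`, `width`, and `height` are assumed to come from your existing setup):

```c
#include <GL/gl.h>

GLuint depthTex;
glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
/* A 24-bit depth texture; NEAREST filtering, since depth values
   should not be interpolated when sampled later. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* Attach it as the FBO's depth attachment instead of a renderbuffer,
   so it can be sampled in the pass-through fragment shader. */
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);
```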

Christian Rau
12

I just found that copying a depth buffer from a renderbuffer to the main (context-provided) depth buffer via glBlitFramebuffer is highly unreliable, because you cannot guarantee that the formats match. Using GL_DEPTH_COMPONENT24 as my internal depth-texture format just didn't work on my AMD Radeon 6950 (latest driver), because Windows (or the driver) decided to use the equivalent of GL_DEPTH24_STENCIL8 as the depth format for my front/back buffer, although I did not request any stencil precision (stencil bits set to 0 in the pixel format descriptor).

When I used GL_DEPTH24_STENCIL8 for my framebuffer's depth texture, the blitting worked as expected, but I had other issues with that format. The first attempt worked fine on NVIDIA cards, so I'm pretty sure I did not mess things up.

What works best (in my experience) is copying via shader:

The Fragment-Program (aka Pixel-Shader) [GLSL]

#version 150

uniform sampler2D depthTexture;
in vec2 texCoords; //texture coordinates from vertex-shader

void main( void )
{
    gl_FragDepth = texture(depthTexture, texCoords).r;
}
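For reference, a matching vertex shader could be as simple as the following (a hypothetical sketch, not part of the original answer; it assumes the fullscreen quad's corners are passed in as positions in NDC, i.e. in [-1, 1]):

```glsl
#version 150

in vec2 position;   // fullscreen quad corner in NDC, [-1, 1]
out vec2 texCoords; // texture coordinates for the fragment program

void main( void )
{
    texCoords = position * 0.5 + 0.5; // map NDC to [0, 1] texture space
    gl_Position = vec4(position, 0.0, 1.0);
}
```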

The C++ code for copying looks like this:

glDepthMask(GL_TRUE);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glEnable(GL_DEPTH_TEST); //depth writes only happen while depth testing is enabled
glBindFramebuffer(GL_FRAMEBUFFER, 0);
depthCopyShader->Enable();
DrawFullscreenQuad(depthTextureIndex);
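After the copy, the color mask is still disabled, so subsequent passes would render nothing visible. A possible cleanup might look like this (a sketch; `depthCopyShader->Disable()` is assumed as the counterpart of the `Enable()` call above):

```cpp
// Re-enable color writes so later passes render normally;
// depth writes were already enabled for the copy itself.
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
depthCopyShader->Disable();
```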

I know the thread is old, but it was one of my first results when googling my issue, so I want to keep it as consistent as possible.

Marius
  • Thanks, solved the problem with my integrated intel graphics. – joesmoe891 Aug 08 '13 at 07:45
  • Is there a way to get the default buffer's format after it's created and use it to create the FBO? – Luke B. Sep 07 '13 at 14:15
  • I also wonder the same thing as @LukeB., as it would allow blitting between two framebuffers knowing that the formats match. That would also work for the case where we want to blit from the default framebuffer, which we can't do with a shader, since its depth buffer is not accessible. – Mattias F Dec 06 '16 at 15:01
  • Had the same problem. – aboutqx Nov 05 '18 at 14:54
2

You cannot attach images (color or depth) to the default framebuffer. Similarly, you can't take images from the default framebuffer and attach them to an FBO.

Nicol Bolas