17

I was trying to run some OpenGL code that didn't have GL_DEPTH_BUFFER_BIT cleared in glClear(), because of which I couldn't render my scene. I added this bit, and the scene was rendered. Why is it necessary to use this clear bit?

I think I know the reason for this, to clear the depth-buffer values used by the GPU previously, but I just want to confirm.

2am

1 Answer

29

The depth buffer holds the "depth" of each pixel in the scene. When OpenGL renders your geometry, each fragment (pixel) is compared against the depth buffer's value at that point. If that fragment has a z value lower than the one in the buffer, it becomes the new lowest value, and thus the pixel to be rendered. If not, it isn't rendered, because something closer is blocking it. That's the gist of it; you can read up on the specifics yourself.
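The comparison described above can be sketched in plain C as a tiny software depth test. `depth_test` here is a hypothetical helper illustrating the default `GL_LESS` behavior, not an actual GL call:

```c
#include <stdbool.h>

/* One depth-buffer cell; returns true if the fragment passes the
 * default GL_LESS test and should be drawn. */
bool depth_test(float *stored_z, float fragment_z)
{
    if (fragment_z < *stored_z) {  /* closer than what's already there? */
        *stored_z = fragment_z;    /* it becomes the new nearest value  */
        return true;               /* draw this fragment                */
    }
    return false;                  /* occluded: something closer wins   */
}
```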

Now, what happens when the scene changes? You want to clear the screen so you can redraw everything, but you also want to clear the depth buffer. Why? Because otherwise all the new fragments would be compared against the depth values from the previous frame. That makes no sense: they should be compared against the values in the frame they belong to. You are correct in your reasoning.
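A minimal simulation of the stale-depth problem, again in plain C with hypothetical helpers rather than GL calls: frame 1 leaves a near value behind, and frame 2's geometry fails the test until the buffer is reset, which is what `glClear(GL_DEPTH_BUFFER_BIT)` does (resetting to the value set by `glClearDepth`, 1.0 by default).

```c
#include <stdbool.h>

#define FAR_PLANE 1.0f  /* default glClearDepth value */

/* GL_LESS-style test against one depth-buffer cell. */
bool depth_test(float *stored_z, float fragment_z)
{
    if (fragment_z < *stored_z) {
        *stored_z = fragment_z;
        return true;
    }
    return false;
}

/* What glClear(GL_DEPTH_BUFFER_BIT) does to every cell. */
void clear_depth(float *stored_z)
{
    *stored_z = FAR_PLANE;
}
```

With a single cell: frame 1 draws a surface at z = 0.2; in frame 2 the camera has moved and the visible surface is now at z = 0.5. Without a clear, the stale 0.2 wrongly rejects it; after a clear, it passes.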

GraphicsMuncher
  • You could also alternate the depth range and the direction of the depth test each frame. But it turns out that clearing the depth buffer is also a performance optimization on modern hardware: they implement lossless Z-buffer (and color buffer) compression by splitting the framebuffer into tiles, so after a clear the GPU only needs to read/write a few bits per tile when fetching values for areas of the screen that have nothing in them. So the old hack people used to avoid clearing the color buffer and depth buffer actually hurts performance on modern GPUs ;) – Andon M. Coleman Oct 19 '13 at 20:59
  • That's exactly what I thought the reason was :) Perfect answer. – 2am Oct 20 '13 at 10:36