
So I have an FBO with a texture attached as its depth attachment. I then bind this texture to a shader so I can add some post-processing effects related to depth, like an outline effect.

However, this depth texture is not normalized. The minimum and maximum values can be anything from -1.0 to 1.0. Is there a way to normalize it so that the minimum depth on the texture is set to -1.0, the maximum depth is set to 1.0, and everything else is just a linear interpolation between the two?

EDIT: Sorry if I wasn't clear. Right now, my depth texture has values ranging from uncontrollable numbers, say from -0.5432 to 0.123. I'd like to change this so -0.5432 becomes -1.0, 0.123 becomes 1.0, and everything else is interpolated between the two.

user5074736
  • I don't understand. What's different between what you have, `[-1,1]`, and what you want, `[-1,1]`? – Ripi2 May 28 '19 at 17:41
  • What is *"normalized"* in this context? The depth in the depth buffer in general is in range [0, 1]. – Rabbid76 May 28 '19 at 17:42
  • @Rabbid76 The depth buffer is in the range [-1,1]. I clarified in my edit. – user5074736 May 28 '19 at 17:45
  • Just scale and translate the depth values in the shader, right before writing them into the FBO – Ripi2 May 28 '19 at 17:45
  • @Ripi2 But how do I do that? The minimum and maximum depth values on screen change depending on where the camera is, so I can never know what they are. -0.5432 and 0.123 were just example numbers. – user5074736 May 28 '19 at 17:50
  • Normally the depth buffer has values in `[-1,1]` range, that's "NDC". If you want a different range you must know the min & max values of your range. If you don't know them, then you need to read that depth buffer, find them, and then use them, perhaps in another draw-call with some special shader uniforms. – Ripi2 May 28 '19 at 17:54
  • @Ripi2 *"Normally the depth buffer has values in [-1,1] range, that's 'NDC'"* is nonsense; the normalized device z coordinate is not the depth value. By default the depth range is [0, 1], but there is [`glDepthRange`](https://www.khronos.org/registry/OpenGL-Refpages/gl4/html/glDepthRange.xhtml), too. – Rabbid76 May 28 '19 at 18:08
  • @Ripi2 so there's no way other than calling `glReadPixels` every frame? Damn, I guess I'll find another method. – user5074736 May 28 '19 at 18:22
  • @user5074736: "*Right now, my depth texture has values ranging from uncontrollable numbers, say from -0.5432 to 0.123.*" No, it doesn't. It is *impossible* for a depth value generated from OpenGL to have a negative value. The API flat out does not allow it. And most depth formats are unsigned normalized integers, which *cannot* represent negative values. If you're getting negative numbers, then you're using a float value and you probably didn't clear the depth buffer. Either that, or your code is invoking undefined behavior somehow. – Nicol Bolas May 28 '19 at 18:27
  • @Nicol Bolas I'm sorry, I was mistaken. Anyway, the example still stands. I think on average my depth values range from something like 0.2 to 0.6, but it changes depending on scene geometry and camera position. – user5074736 May 28 '19 at 18:36
  • @user5074736: Of course the depth changes. It's the depth of the scene geometry relative to the camera. If the camera moves, then the depth of the visible objects change. If the scene geometry moves around, the depth relative to the camera of the various pixels changes. I don't understand the issue. – Nicol Bolas May 28 '19 at 18:37
  • I want a texture where the lowest visible depth value is exactly 0, the highest visible depth value is 1, and everything else is interpolated. Sorry if I haven't been clear. – user5074736 May 28 '19 at 18:39
  • Well, if you _really_ want such a behavior, you could implement it. Either by adjusting the near and far clipping planes to perfectly match the z extent of your scene, or by finding the min and max of the depth buffer after the render pass. But I really think you're on the wrong track here. – derhass May 28 '19 at 19:09
  • see [How to correctly linearize depth in OpenGL ES in iOS?](https://stackoverflow.com/a/42515399/2521214) that should be easiest to implement and also the most accurate ... linearizing after render is too late and suffers from bad rounding errors – Spektre May 29 '19 at 07:02

0 Answers