
So I have an app where I initialize THREE.WebGLRenderer with logarithmicDepthBuffer: true. I also render to a THREE.WebGLRenderTarget whose depthTexture is set to a THREE.DepthTexture with a type of THREE.UnsignedIntType.
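
For reference, a minimal sketch of that setup (the size and variable names are illustrative, not from the original app):

```javascript
// Renderer with the logarithmic depth buffer enabled.
const renderer = new THREE.WebGLRenderer( { logarithmicDepthBuffer: true } );

// Render target with an attached depth texture of type UnsignedIntType.
// (1024x1024 is an illustrative size.)
const target = new THREE.WebGLRenderTarget( 1024, 1024 );
target.depthTexture = new THREE.DepthTexture();
target.depthTexture.type = THREE.UnsignedIntType;
```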

I am wondering if the depthTexture now contains the logarithmic depth, or if it still contains the same values as before (when logarithmicDepthBuffer was set to false).

And if it does contain the logarithmic depth, does anyone know the GLSL code required to convert it back to a linear depth in [0...1], where 1 == farPlane?

I currently read the depth texture value based on this example: https://github.com/mrdoob/three.js/blob/master/examples/webgl_depth_texture.html
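
For context on what such a conversion could look like: with logarithmicDepthBuffer enabled, three.js writes roughly gl_FragDepth = log2(1.0 + w) / log2(far + 1.0), where w is the fragment's clip-space w. Under that assumption, a sketch that inverts the encoding back to a linear [0...1] value might look like the block below. The uniform names tDepth and cameraFar are illustrative, and the exact encoding should be verified against your three.js version.

```glsl
// Fragment shader sketch, intended for use in a three.js ShaderMaterial
// (three.js prepends the precision header). Reads the logarithmic depth
// texture and converts it back to a linear [0..1] depth, where 1.0
// corresponds to the far plane.
// Assumes three.js encoded depth as log2(1.0 + w) / log2(far + 1.0).
uniform sampler2D tDepth;   // the render target's depthTexture (illustrative name)
uniform float cameraFar;    // camera.far passed in as a uniform (illustrative name)

varying vec2 vUv;

float logDepthToLinear( float logDepth, float far ) {
	// Invert the encoding to recover w (approximately the view-space
	// distance for a perspective camera) ...
	float w = exp2( logDepth * log2( far + 1.0 ) ) - 1.0;
	// ... then normalize so that the far plane maps to 1.0.
	return w / far;
}

void main() {
	float logDepth = texture2D( tDepth, vUv ).r;
	float linearDepth = logDepthToLinear( logDepth, cameraFar );
	gl_FragColor = vec4( vec3( linearDepth ), 1.0 );
}
```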

Eric
  • It will contain the modified depth, as the depth is set in the fragment shader. Why do you need to change it? – Rasheduzzaman Sourov Jan 18 '18 at 09:50
  • See [How to correctly linearize depth in OpenGL ES in iOS?](https://stackoverflow.com/a/42515399/2521214). I do not code in **three.js**, so I am not comfortable marking this as a duplicate, but I assume the code/method would be the same on your platform too. – Spektre Jan 18 '18 at 10:54
  • Rasheduzzaman Sourov are you guessing this, or do you know it for a fact? When I convert the value to linear (using the code from that example), it looks exactly the same whether logarithmicDepthBuffer is set to true or false. – Eric Jan 18 '18 at 16:04
  • @Eric, to notify a user you need to add `@` before the nick. Yes, the depth buffer values are the ones set in the fragment shader with `gl_FragDepth=....;`. Beware that the line should come before the final output color computation, otherwise it could get optimized out. – Spektre Jan 19 '18 at 08:27
  • Ah yep, I had a custom shader and thought it would write the depth magically by itself, but it did not. I just needed to write the proper logarithmic depth value that everything else was writing (see the sketch below). After that, I was able to reverse it to recover whatever value I wanted from the depthTexture. – Eric Jan 23 '18 at 23:27
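
For anyone hitting the same issue, a minimal sketch of the kind of fix described in the last comment: a custom ShaderMaterial that includes three.js's logdepthbuf shader chunks so it writes the same logarithmic depth as the built-in materials. The chunk names come from the three.js shader library; the rest of the material is illustrative, and behavior can differ between three.js versions.

```javascript
// Sketch: custom ShaderMaterial that writes logarithmic depth via the
// same shader chunks the built-in materials use (illustrative material).
const material = new THREE.ShaderMaterial( {
	vertexShader: `
		#include <common>
		#include <logdepthbuf_pars_vertex>
		void main() {
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
			#include <logdepthbuf_vertex>
		}
	`,
	fragmentShader: `
		#include <common>
		#include <logdepthbuf_pars_fragment>
		void main() {
			// Write the log depth before the final color output so the
			// write is not optimized away (as noted in the comments above).
			#include <logdepthbuf_fragment>
			gl_FragColor = vec4( 1.0, 0.0, 0.0, 1.0 );
		}
	`
} );
```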

0 Answers