
I'm implementing deferred shading in my OpenGL app, and rather than waste waaay too much memory storing position information, I want to reconstruct the view-space position in the fragment shader using information from the depth buffer. It appears as though I have correct x/y, though I'm not completely sure, but I know that my z information is out the window. Here's the part of the fragment shader responsible for reconstructing the position:

vec3 reconstructPositionWithMat(void)
{
    // Sample the non-linear depth in [0, 1] and remap it to NDC [-1, 1].
    float depth = texture2D(depthBuffer, texCoord).x;
    depth = (depth * 2.0) - 1.0;

    // Remap the screen-space texture coordinate to NDC as well.
    vec2 ndc = (texCoord * 2.0) - 1.0;

    // Unproject: the inverse projection takes NDC back toward view space,
    // and the divide by w undoes the perspective scaling.
    vec4 pos = matInvProj * vec4(ndc, depth, 1.0);
    return pos.xyz / pos.w;
}

`matInvProj` is the inverse of my projection matrix (calculated on the CPU and uploaded as a uniform `mat4`). When I try to render my position information (`fragColor = vec4(position, 1.0);`), the screen is black in the lower-left corner, red in the lower-right corner, green in the upper-left corner, and yellow in the upper-right corner. This is roughly what I would expect to see if my depth was uniformly 0.0 across the entire scene, which it obviously should not be.

What am I doing wrong?

Haydn V. Harach
  • I fixed the function (added pos.xyz / pos.w), and now it seems to correctly transform the x/y according to what I would expect, but z is still 0. If I use pos.w instead of pos.z as my depth, it seems to be inverted (1 = at camera, 0 = far plane), and definitely non-linear (or maybe it is linear and it's going into negatives, I can't tell). – Haydn V. Harach Mar 09 '14 at 18:32
  • Another interesting development... I created a buffer to hold positions directly, just to test it out, and when I read this buffer and display it... I get the same result. The same black/red/yellow/green as my fragment shader above, and no blue to be seen. – Haydn V. Harach Mar 09 '14 at 20:47
  • Are you sure Z is zero and not various negative values corresponding to the camera space Z position of the points (since by convention cameras look along the negative Z axis)? – MikeMx7f Mar 09 '14 at 22:20
  • I tried negating z (`pos.z = -pos.z;`) to no avail. Then, I tried adding 1 (`pos.z += 1.0;`), and now I get some blue but only when something is extremely close to the camera (definitely non-linear z). – Haydn V. Harach Mar 09 '14 at 23:25
  • ...Or maybe it was linear all along and I was just looking at it wrong. I changed the way it's rendered (`fragColor = vec4(position.xy, -(position.z + 1.0), 1.0)`), and now it shows me roughly what I would expect to see. Not only that, but it perfectly matches the output from my position buffer (at least as far as my eye can tell). – Haydn V. Harach Mar 10 '14 at 00:07
  • @HaydnV.Harach, how did you come up with this "adding 1"? – Eduardo Reis Jun 06 '14 at 14:44

1 Answer


I found the problem: I was simply interpreting the data wrong. Rendering it as `fragColor = vec4(position.xy, -(position.z + 1.0), 1.0)` gave the results I expected. In addition, when I compared it against the position buffer (`0.5 + (reconstructedPos - bufferPos)`), my scene ended up being mostly gray, except for far-away regions where depth precision becomes an issue, which is what I would expect from a correct reconstruction.
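The same unprojection math can be checked on the CPU. The sketch below (NumPy, not from the original post; the projection parameters are made up for illustration) projects a view-space point to NDC, then runs it back through the inverse projection exactly as the shader does. The recovered z is negative, because by OpenGL convention the camera looks down the negative Z axis in view space — which is why a raw `vec4(position, 1.0)` shows no blue, and why negating/offsetting z makes it visible.

```python
import numpy as np

def perspective(fovy_deg, aspect, near, far):
    # Standard OpenGL perspective projection matrix (as used with M @ v).
    f = 1.0 / np.tan(np.radians(fovy_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0,  0.0,                           0.0],
        [0.0,        f,    0.0,                           0.0],
        [0.0,        0.0, (far + near) / (near - far),    2.0 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                           0.0],
    ])

# A point in view space: the camera looks down -Z, so z is negative.
view_pos = np.array([1.5, -0.5, -10.0, 1.0])

proj = perspective(60.0, 16.0 / 9.0, 0.1, 100.0)

# Forward path: view -> clip -> NDC (perspective divide).
clip = proj @ view_pos
ndc = clip[:3] / clip[3]          # each component lands in [-1, 1]

# Backward path: what the shader does with matInvProj, including the
# divide by w. (The shader first remaps depth from [0, 1] to [-1, 1];
# here we already have NDC depth, so that step is skipped.)
p = np.linalg.inv(proj) @ np.append(ndc, 1.0)
reconstructed = p[:3] / p[3]

print(reconstructed)  # ≈ [1.5, -0.5, -10.0]: z comes back negative
```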

Haydn V. Harach