I want to render a depth map of a scene in Blender (v2.65a, using the Cycles engine) that uses the distance to the image plane (or to any plane parallel to it, e.g. the yz-plane) as the depth value.
It is possible to use the composite nodes to render a simple depth map as described in the documentation. The problem is that the z-values used are based on the distance to the camera, not to the image plane. This means that lines parallel to the image plane will have varying z-values depending on their location in the rendered image, as can be seen in this example (the rendering depicts the front of a cube; values have been normalized):
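For reference, this is roughly the setup I am using, expressed as a Python script (a minimal sketch; the render layer name "RenderLayer" and the 2.65-era node type identifiers are assumptions, and newer Blender versions use names like "CompositorNodeRLayers" instead):

```python
import bpy

scene = bpy.context.scene
scene.render.layers["RenderLayer"].use_pass_z = True  # make sure the Z pass is rendered

scene.use_nodes = True
tree = scene.node_tree
for node in list(tree.nodes):  # start from an empty node tree
    tree.nodes.remove(node)

# Node type identifiers below are the 2.65-era enum names;
# newer versions expect e.g. "CompositorNodeRLayers" instead.
rl = tree.nodes.new("R_LAYERS")      # Render Layers node, provides the Z output
norm = tree.nodes.new("NORMALIZE")   # map the raw Z range to [0, 1] for viewing
comp = tree.nodes.new("COMPOSITE")   # final composite output

tree.links.new(rl.outputs["Z"], norm.inputs["Value"])
tree.links.new(norm.outputs["Value"], comp.inputs["Image"])
```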
I want a depth map that assigns the same depth to all points on a plane parallel to the image plane (without changing the overall projection, since the depth map still has to line up with the normal rendering).
This could be achieved by post-processing the depth map with a script that corrects each value according to its angle from the view direction, but I was wondering whether there is a way to get the correct values directly out of the rendering process.
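To illustrate the kind of correction I have in mind: for a pinhole camera, a pixel whose ray makes an angle θ with the view direction satisfies z_plane = z_camera · cos θ, so the post-processing step would look something like this (a rough numpy sketch; the function name is hypothetical, and it assumes a simple pinhole camera with square pixels and a known horizontal FOV):

```python
import numpy as np

def radial_to_planar_depth(z_cam, fov_x):
    """Convert camera-distance depth values to image-plane distances.

    z_cam : 2D array of depth values as rendered (radial distance to the camera)
    fov_x : horizontal field of view of the camera, in radians
    """
    h, w = z_cam.shape
    # focal length in pixel units, derived from the horizontal FOV
    f = (w / 2.0) / np.tan(fov_x / 2.0)
    # per-pixel offsets from the image center (square pixels assumed)
    x = np.arange(w) - (w - 1) / 2.0
    y = np.arange(h) - (h - 1) / 2.0
    xx, yy = np.meshgrid(x, y)
    # cosine of the angle between each pixel's ray and the view direction
    cos_theta = f / np.sqrt(xx ** 2 + yy ** 2 + f ** 2)
    # projecting the radial distance onto the view axis gives the distance
    # to the plane through the camera parallel to the image plane
    return z_cam * cos_theta
```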
So is there a way to use composite nodes (or a similar Blender feature) to render a depth map based on the distance to the image plane?