
I am trying to model my scene in light space as preparation for my shadow mapping, however I am massively confused.

The line that calculates the position in my shader is `gl_Position = light_projection_matrix * light_view_matrix * light_model_matrix * position;`, where `position` is a model coordinate (equal to a world coordinate in my case) given by 3 floats. I read it in as a `vec4`, and that part has proven to work.

Now I use the following, coded in Java, for my matrices:

lightModelMatrix.identity().multiply(modelMatrix);
lightViewMatrix.identity().lookAt(new Vector3f(-20f, 7.5f, -20f), Vector3f.O, Vector3f.Y);
lightProjectionMatrix.identity().frustum(-1f, 1f, -1f, 1f, -200f, 200f);

where `modelMatrix` is `identity()`.

I think the issue is that the lookAt matrix is changing the `.w` component of my vector. I am not sure whether that should be happening; I only know that the projection matrix must produce a `.w` component.
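My understanding is that an affine matrix, i.e. one whose bottom row is (0, 0, 0, 1), should leave `.w` untouched. Here is the kind of quick standalone check I mean (a minimal sketch using a plain column-major `double[16]` instead of my actual Matrix4f class):

```java
public class AffineWCheck {
    // Column-major 4x4 times vec4, the same convention GLSL uses for m * v.
    public static double[] transform(double[] m, double[] v) {
        double[] r = new double[4];
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                r[row] += m[col * 4 + row] * v[col];
        return r;
    }

    public static void main(String[] args) {
        // An arbitrary affine matrix: rotation in the upper 3x3, translation in
        // the fourth column, bottom row (0, 0, 0, 1).
        double[] affine = {
            0.8,  0.6, 0.0, 0.0,   // column 0
           -0.6,  0.8, 0.0, 0.0,   // column 1
            0.0,  0.0, 1.0, 0.0,   // column 2
            5.0, -3.0, 2.0, 1.0    // column 3 (translation)
        };
        double[] p = transform(affine, new double[]{1, 2, 3, 1});
        System.out.println("w after affine transform: " + p[3]); // stays 1.0
    }
}
```

If `.w` comes out as anything other than 1 here, the matrix is not affine (or the storage order is transposed).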

So this is the code for my lookAt:

public Matrix4f lookAt(final Vector3f eye, final Vector3f at, final Vector3f up) {
    Vector3f zAxis = at.subtract(eye).normalized();
    Vector3f xAxis = up.cross(zAxis).normalized();
    Vector3f yAxis = zAxis.cross(xAxis);
    return multiply(
        xAxis.getX(),   xAxis.getY(),   xAxis.getZ(),   -xAxis.dot(eye),    //X column
        yAxis.getX(),   yAxis.getY(),   yAxis.getZ(),   -yAxis.dot(eye),    //Y column
        zAxis.getX(),   zAxis.getY(),   zAxis.getZ(),   -zAxis.dot(eye),    //Z column
        0.0f,           0.0f,           0.0f,           1.0f                //W column
    );
}

All matrices in my code are defined in column major order.

Also, what is a good way to debug the gl_Position that gets calculated in the Vertex Shader to see if an issue is there?

skiwi

1 Answer


In the projection matrix, the near/far values should both be positive: `frustum` builds a `glFrustum`-style perspective projection, which requires 0 < near < far (for example `frustum(-1f, 1f, -1f, 1f, 0.1f, 200f)`, with whatever small positive near plane fits your scene).

In the view matrix, your zAxis should be pointing backwards (from the target to the eye) so that (xAxis, yAxis, zAxis) is a right-handed orthonormal basis:

    Vector3f zAxis = eye.subtract(at).normalized();

With this modification, the rest of the computation for this matrix seems correct. For reference, see this answer for a complete derivation of the "look at" matrix.
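If you want to verify the fix numerically, you can recompute the basis with plain arrays (a minimal sketch; the helper methods below are stand-ins for your Vector3f operations) and check that (xAxis, yAxis, zAxis) is a right-handed orthonormal basis, i.e. that xAxis × yAxis reproduces zAxis:

```java
public class LookAtBasisCheck {
    public static double[] sub(double[] a, double[] b) {
        return new double[]{a[0] - b[0], a[1] - b[1], a[2] - b[2]};
    }
    public static double[] cross(double[] a, double[] b) {
        return new double[]{a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]};
    }
    public static double dot(double[] a, double[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }
    public static double[] normalized(double[] a) {
        double len = Math.sqrt(dot(a, a));
        return new double[]{a[0]/len, a[1]/len, a[2]/len};
    }

    public static void main(String[] args) {
        double[] eye = {-20, 7.5, -20}, at = {0, 0, 0}, up = {0, 1, 0};

        // Corrected: zAxis points from the target back to the eye.
        double[] zAxis = normalized(sub(eye, at));
        double[] xAxis = normalized(cross(up, zAxis));
        double[] yAxis = cross(zAxis, xAxis);

        // Right-handed orthonormal basis: xAxis x yAxis must equal zAxis.
        double[] check = cross(xAxis, yAxis);
        System.out.printf("xAxis x yAxis = (%.4f, %.4f, %.4f)%n", check[0], check[1], check[2]);
        System.out.printf("zAxis         = (%.4f, %.4f, %.4f)%n", zAxis[0], zAxis[1], zAxis[2]);
    }
}
```

With the original `at.subtract(eye)` the check comes out as the negation of zAxis, which is exactly the left-handed basis causing the problem.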

Also, what is a good way to debug the gl_Position that gets calculated in the Vertex Shader to see if an issue is there?

In this context, the simplest way is to take one vertex (or a few vertices, or the model's bounding box) and apply the transformation done in your vertex shader in software, on the CPU. Then check the result to see if it makes sense (inside the view frustum, at the right distance, etc.).
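As a sketch of that approach (assuming the standard `glFrustum` matrix layout and the corrected zAxis; the 0.1 near value is a placeholder I picked, not taken from your code), the following transforms one vertex exactly as the shader would and prints its normalized device coordinates:

```java
public class GlPositionCheck {
    public static double dot(double[] a, double[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }
    public static double[] cross(double[] a, double[] b) {
        return new double[]{a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]};
    }
    public static double[] normalized(double[] a) {
        double len = Math.sqrt(dot(a, a));
        return new double[]{a[0]/len, a[1]/len, a[2]/len};
    }

    // Column-major 4x4 times vec4, matching GLSL's m * v.
    public static double[] transform(double[] m, double[] v) {
        double[] r = new double[4];
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                r[row] += m[col * 4 + row] * v[col];
        return r;
    }

    // glFrustum-style perspective projection, column-major; n and f both positive.
    public static double[] frustum(double l, double r, double b, double t, double n, double f) {
        return new double[]{
            2*n/(r-l), 0,         0,             0,
            0,         2*n/(t-b), 0,             0,
            (r+l)/(r-l), (t+b)/(t-b), -(f+n)/(f-n), -1,
            0,         0,         -2*f*n/(f-n),  0
        };
    }

    // lookAt view matrix with the corrected zAxis = eye - at, column-major.
    public static double[] lookAt(double[] eye, double[] at, double[] up) {
        double[] z = normalized(new double[]{eye[0]-at[0], eye[1]-at[1], eye[2]-at[2]});
        double[] x = normalized(cross(up, z));
        double[] y = cross(z, x);
        return new double[]{
            x[0], y[0], z[0], 0,
            x[1], y[1], z[1], 0,
            x[2], y[2], z[2], 0,
            -dot(x, eye), -dot(y, eye), -dot(z, eye), 1
        };
    }

    // Replays the vertex shader: clip = P * V * vertex, then the perspective divide.
    public static double[] ndc(double[] vertex) {
        double[] view = lookAt(new double[]{-20, 7.5, -20}, new double[]{0, 0, 0}, new double[]{0, 1, 0});
        double[] proj = frustum(-1, 1, -1, 1, 0.1, 200); // placeholder positive near plane
        double[] clip = transform(proj, transform(view, vertex));
        return new double[]{clip[0]/clip[3], clip[1]/clip[3], clip[2]/clip[3]};
    }

    public static void main(String[] args) {
        double[] p = ndc(new double[]{0, 0, 0, 1}); // a vertex at the world origin
        System.out.printf("NDC: (%.3f, %.3f, %.3f)%n", p[0], p[1], p[2]);
        // The vertex is inside the view frustum iff every component is in [-1, 1].
    }
}
```

Any vertex whose NDC falls outside [-1, 1] on any axis is clipped, which immediately tells you whether the matrices or the shader are at fault.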

user3146587
  • Tried it, however still doesn't work, need to be debugging more. – skiwi Feb 05 '14 at 21:22
  • Pass the position to the fragment shader and use it as color to see where your vertices are. (edit: more reddish means positive x axis, more greenish colors positive y... you can also scale it to avoid black for negative values and you can either use model or world coordinates, depending on what you want to check) – Sebastian Höffner Feb 05 '14 at 23:59
  • @SebastianHöffner This approach assumes that the generated vertices belong to visible primitives. Before passing any information to debug to the fragment shader, you should really make sure something is eventually going to be rasterized. Else yes, this would be the next step in debugging `gl_Position`. – user3146587 Feb 06 '14 at 00:08