
I have a simple vertex shader, written in GLSL, and I was wondering if someone could aid me in calculating the normals for the surface. I am 'upgrading' a flat surface, so the current light model looks... weird. Here is my current code:

varying vec4 oColor;
varying vec3 oEyeNormal;
varying vec4 oEyePosition;

uniform float Amplitude;     // Amplitude of sine wave
uniform float Phase;         // Phase of sine wave
uniform float Frequency;     // Frequency of sine wave

varying float sinValue;

void main()
{
    vec4 thisPos = gl_Vertex;

    thisPos.z = sin( ( thisPos.x + Phase ) * Frequency) * Amplitude;

    // Transform normal and position to eye space (for fragment shader)
    oEyeNormal    = normalize( vec3( gl_NormalMatrix * gl_Normal ) );
    oEyePosition  = gl_ModelViewMatrix * thisPos;       

    // Transform vertex to clip space for fragment shader
    gl_Position   = gl_ModelViewProjectionMatrix * thisPos;

    sinValue = thisPos.z;
}

Does anyone have any ideas?

Josh

2 Answers


Ok, let's just take this from the differential geometry perspective. You have a parametric surface with parameters s and t:

X(s,t) = ( s, t, A*sin((s+P)*F) )

So we first compute the tangents of this surface, which are the partial derivatives with respect to our two parameters:

Xs(s,t) = ( 1, 0, A*F*cos((s+P)*F) )
Xt(s,t) = ( 0, 1, 0 )

Then we just need to compute the cross product of these to get the normal:

N = Xs x Xt = ( -A*F*cos((s+P)*F), 0, 1 )
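This derivation is easy to sanity-check numerically. Here is a small sketch in Python (not GLSL), assuming NumPy and arbitrary test values for the Amplitude/Phase/Frequency uniforms, that compares the analytic normal against finite-difference tangents:

```python
import numpy as np

A, P, F = 0.5, 0.3, 2.0  # arbitrary test values for Amplitude, Phase, Frequency

def X(s, t):
    """The parametric surface X(s,t) = (s, t, A*sin((s+P)*F))."""
    return np.array([s, t, A * np.sin((s + P) * F)])

def analytic_normal(s, t):
    """N = Xs x Xt = (-A*F*cos((s+P)*F), 0, 1), normalized."""
    n = np.array([-A * F * np.cos((s + P) * F), 0.0, 1.0])
    return n / np.linalg.norm(n)

# Central finite differences approximate the partial derivatives Xs and Xt
s, t, h = 0.7, -1.2, 1e-6
Xs = (X(s + h, t) - X(s - h, t)) / (2 * h)
Xt = (X(s, t + h) - X(s, t - h)) / (2 * h)
n_fd = np.cross(Xs, Xt)
n_fd /= np.linalg.norm(n_fd)

print(np.allclose(n_fd, analytic_normal(s, t), atol=1e-5))  # prints True
```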

So your normal can be computed completely analytically; you don't actually need the gl_Normal attribute:

float angle = (thisPos.x + Phase) * Frequency;
thisPos.z = sin(angle) * Amplitude;
vec3 normal = normalize(vec3(-Amplitude*Frequency*cos(angle), 0.0, 1.0));

// Transform normal and position to eye space (for fragment shader)
oEyeNormal    = normalize( gl_NormalMatrix * normal );

The normalization of normal might not be necessary (since we normalize the transformed normal anyway), but right at the moment I'm not sure whether an unnormalized normal would behave correctly in the presence of non-uniform scaling. Of course, if you want the normal to point in the negative z-direction, you need to negate it.
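Putting the snippet into your shader, the full vertex shader could look something like this (a sketch only, reusing the uniforms and varyings from the question):

```glsl
varying vec4 oColor;
varying vec3 oEyeNormal;
varying vec4 oEyePosition;

uniform float Amplitude;     // Amplitude of sine wave
uniform float Phase;         // Phase of sine wave
uniform float Frequency;     // Frequency of sine wave

varying float sinValue;

void main()
{
    vec4 thisPos = gl_Vertex;

    float angle = (thisPos.x + Phase) * Frequency;
    thisPos.z   = sin(angle) * Amplitude;

    // Analytic normal of the displaced surface (see derivation)
    vec3 normal = normalize(vec3(-Amplitude * Frequency * cos(angle), 0.0, 1.0));

    // Transform normal and position to eye space (for fragment shader)
    oEyeNormal   = normalize(gl_NormalMatrix * normal);
    oEyePosition = gl_ModelViewMatrix * thisPos;

    // Transform vertex to clip space
    gl_Position  = gl_ModelViewProjectionMatrix * thisPos;

    sinValue = thisPos.z;
}
```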


Actually, going over a surface in space wasn't necessary. We can also just reason about the sine curve inside the x-z-plane, since the y-component of the normal is zero anyway (only z depends on x). So we take the tangent to the curve z = A*sin((x+P)*F), whose slope is the derivative of z, giving the x-z-vector (1, A*F*cos((x+P)*F)). The normal to this is then just (-A*F*cos((x+P)*F), 1) (swap the coordinates and negate one), which supplies the x and z of the (unnormalized) normal. No 3D vectors or partial derivatives, but the outcome is the same.
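The swap-and-negate step can be verified in a couple of lines of Python (again with arbitrary test values): the 2D normal built that way is exactly perpendicular to the curve's tangent.

```python
import numpy as np

A, P, F = 0.5, 0.3, 2.0  # arbitrary test values
x = 0.7

slope = A * F * np.cos((x + P) * F)   # dz/dx of z = A*sin((x+P)*F)
tangent_2d = np.array([1.0, slope])   # x-z tangent to the curve
normal_2d  = np.array([-slope, 1.0])  # swap coordinates, negate one

print(np.dot(tangent_2d, normal_2d))  # prints 0.0 -- perpendicular
```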

Christian Rau
  • Thank you so much! I will try this shortly, once I boot into my Win7 partition. I'm having one hell of a time with lighting and GLSL! – Josh Mar 07 '12 at 02:31

Furthermore, you can tweak your performance a bit:

oEyeNormal = normalize(vec3(gl_NormalMatrix * gl_Normal));
  1. There is no need to cast to a vec3, since gl_NormalMatrix is already a 3x3 matrix.
  2. There is no need to normalize your incoming normal in the vertex shader, since you don't do any length-based calculation in it. Some sources say that incoming normals should always be normalized by the application, so that there is no need for it in the vertex shader at all. But since that's out of the shader developer's hands, I still normalize them when I calculate vertex-based lighting (Gouraud).
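One caveat on the normalization point, sketched numerically in Python rather than GLSL (the scale factor and input normal are arbitrary test values): even a unit-length input normal generally comes out of the normal matrix (the inverse-transpose of the model-view's upper-left 3x3) with non-unit length under non-uniform scaling, so the normalize after the transform still serves a purpose.

```python
import numpy as np

# Model-view matrix with a non-uniform scale: x scaled by 2
M = np.diag([2.0, 1.0, 1.0])

# The normal matrix is the inverse-transpose of the upper-left 3x3
normal_matrix = np.linalg.inv(M).T

n = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)  # unit-length input normal
n_transformed = normal_matrix @ n

print(np.linalg.norm(n_transformed))  # clearly != 1, so re-normalization is needed
```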
djmj
  • 1. - Well, the cast is unnecessary, but neither will it cost any performance, since a cast of a vec3 to a vec3 should be a no-op (the GLSL compiler can't be that stupid as to do a copy here). But you're right in that it's totally unnecessary. – Christian Rau Nov 02 '12 at 13:36
  • 2. - That is plain wrong. He doesn't normalize the incoming normal (which is indeed not a good idea, since the application should of course provide normalized, well, normals). But he normalizes the transformed normal (after multiplying with the `gl_NormalMatrix`) and that **is** necessary to account for any scaling and shearing transformations. And of course you know that interpolating unnormalized normals across a surface results in wrong per-fragment normals. So in the end there isn't much performance to tweak here. – Christian Rau Nov 02 '12 at 13:40
  • And in the end it doesn't even answer the actual question and might have been better suited as a comment. – Christian Rau Nov 02 '12 at 13:47
  • @ChristianRau My answer is better suited as a comment since it's off-topic, there I agree. But you should read my answer again carefully. I said there is no need `in your vertex shader` to normalize it unless you use it in special calculations such as lighting in your vertex shader! – djmj Nov 02 '12 at 18:02
  • Yes, there is no need to normalize the input normal in the vertex shader. But the OP doesn't do that anyway. What he does is normalize the transformed normal (after multiplying by the `gl_NormalMatrix`) and **that is necessary** to account for scaling and the like, because you need to put a properly normalized normal into the varying in order to get properly interpolated normals for the fragments. Please read my comments again carefully. – Christian Rau Nov 03 '12 at 01:41