
I am trying to implement normal mapping and lighting in my fragment shader, but my code does not seem to work. My idea is to pass an extra copy of ModelMatrix to the fragment shader so that it can transform the normals stored in the texture. However, the reflected light actually appears on the opposite side.

[Update] After getting some advice, I now calculate the TBN matrix in the vertex shader, transform the TBN basis to world space, and convert lightPosition and eyePosition to tangent space. But this time the planet is not lit at all! Only the ambient light shows up. I then found out that this is because the diffuse component evaluates to negative values, which are then clamped to 0. I cannot seem to find the bug in the shader.

VertexShader:

#version 400

in layout(location=0) vec3 vertexPosition;
in layout(location=1) vec2 vertexUV;
in layout(location=2) vec3 vertexNormal;
in layout(location=3) vec3 vertexTangent;
in layout(location=4) vec3 vertexBitangent;

out vec2 UV;
out vec3 vertexPositionWorld;
out vec3 tangentLightPos;
out vec3 tangentViewPos;
out vec3 tangentFragPos;

uniform vec3 lightPositionWorld;
uniform vec3 eyePositionWorld;

uniform mat4 M;
uniform mat4 V;
uniform mat4 P;

void main()
{
    gl_Position = P * V * M * vec4(vertexPosition, 1.0f);
    vertexPositionWorld = vec3(M * vec4(vertexPosition, 1.0f));
    UV = vertexUV;

    // note scaling problem here
    mat3 normalMatrix = transpose(inverse(mat3(M)));
    // mat3 normalMatrix = inverse(transpose(mat3(M)));
    vec3 T = normalize(normalMatrix * vertexTangent);
    vec3 B = normalize(normalMatrix * vertexBitangent);
    vec3 N = normalize(normalMatrix * vertexNormal);

    mat3 TBN = transpose(mat3(T, B, N));
    tangentLightPos = TBN * lightPositionWorld;
    tangentViewPos  = TBN * eyePositionWorld;
    tangentFragPos  = TBN * vertexPositionWorld;
}

FragmentShader:

#version 400

uniform sampler2D textureSampler_1;
uniform sampler2D textureSampler_2;
uniform vec3 AmbientLightPower;
uniform vec3 DiffuseLightPower;
uniform vec3 SpecularLightPower;
uniform float specularLightPower;

in vec2 UV;
in vec3 vertexPositionWorld;
in vec3 tangentLightPos;
in vec3 tangentViewPos;
in vec3 tangentFragPos;

out vec4 finalColor;

void main()
{
    vec3 normal = texture(textureSampler_2, UV).rgb;
    normal = normalize(normal * 2.0 - 1.0);

    vec3 MaterialAmbientColor  = texture(textureSampler_1, UV).rgb;
    vec3 MaterialDiffuseColor  = texture(textureSampler_1, UV).rgb;
    vec3 MaterialSpecularColor = vec3(0.3, 0.3, 0.3);

    // diffuse light
    vec3 lightDirection = normalize(tangentLightPos - tangentFragPos);
    float DiffuseBrightness = clamp(dot(lightDirection, normal), 0.0, 1.0);

    // specular light
    vec3 reflectedDirection = normalize(reflect(-lightDirection, normal));
    vec3 viewDirection = normalize(tangentViewPos - tangentFragPos);
    float SpecularBrightness = clamp(dot(reflectedDirection, viewDirection), 0.0, 1.0);

    finalColor = vec4(MaterialAmbientColor  * AmbientLightPower +
                      MaterialDiffuseColor  * DiffuseLightPower  * DiffuseBrightness +
                      MaterialSpecularColor * SpecularLightPower * pow(SpecularBrightness, specularLightPower),
                      1.0f);
}
Louis Kuang
  • Is ModelMatrixForFrag the inverse-transpose of the model matrix? – quazeeee Nov 14 '16 at 14:28
  • It is just the ModelMatrix; I simply want a copy in the fragment shader so that I do not need to pass it as an output from the vertex shader. Which is more efficient? – Louis Kuang Nov 14 '16 at 14:29
  • 1
    all uniform variables can be used in all shaders, you don't need two different uniform for identical data. But for transform normal data we don't need use model matrix, becouse model matrix couse error. instead model matrix we ned use inverse and transpose model matrix. modelMatrixFornormal = transpose(inverse(modelMatrix)); – quazeeee Nov 14 '16 at 14:42
  • Do the coordinate-space transformation into tangent space in your vertex shader instead of repeating the same calculation per-fragment. In tangent space, the lighting vectors will interpolate the same way as texture coordinates, so you can rely on attribute interpolation rather than recalculating everything in the fragment shader. – Andon M. Coleman Nov 14 '16 at 21:55
  • Hey @quazeeee, why do I have to invert and transpose the model matrix? – Louis Kuang Nov 15 '16 at 11:20
  • I'll just give you a link to another question: http://stackoverflow.com/questions/13654401/what-is-the-logic-behind-transforming-normals-with-the-transpose-of-the-inverse – quazeeee Nov 15 '16 at 11:43
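
The inverse-transpose rule from the comments can be checked numerically on the host side. The sketch below (plain Python rather than GLSL, purely for illustration) uses a non-uniform scale matrix: transforming a normal by the model matrix itself breaks its perpendicularity to the surface, while transforming it by `transpose(inverse(M))` preserves it.

```python
def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def matvec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def inverse(m):
    # Inverse of a 3x3 matrix via the adjugate divided by the determinant.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [
        [ e*i - f*h, -(b*i - c*h),  b*f - c*e],
        [-(d*i - f*g),  a*i - c*g, -(a*f - c*d)],
        [ d*h - e*g, -(a*h - b*g),  a*e - b*d],
    ]
    return [[x / det for x in row] for row in adj]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Non-uniform scale: squash the y axis to a quarter.
M = [[1.0, 0.0,  0.0],
     [0.0, 0.25, 0.0],
     [0.0, 0.0,  1.0]]

# A surface tangent and its normal, perpendicular before transformation.
tangent = [1.0,  1.0, 0.0]
normal  = [1.0, -1.0, 0.0]

t_world = matvec(M, tangent)

# Wrong: transforming the normal by M breaks perpendicularity.
n_wrong = matvec(M, normal)
print(dot(t_world, n_wrong))   # 0.9375 -> no longer a valid normal

# Right: transform by transpose(inverse(M)).
normal_matrix = transpose(inverse(M))
n_right = matvec(normal_matrix, normal)
print(dot(t_world, n_right))   # 0.0 -> still perpendicular
```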

1 Answer


Your approach takes the normal map and transforms it by the model matrix. That is valid only when the normal map is a world-space normal map. World-space normal maps are rarely used; normal maps are typically tangent-space normal maps. If you're not sure which type you have, examine the texture: if it looks mostly blue, you very probably have a tangent-space normal map; if it's very multicoloured, it's probably world-space and my answer is wrong for you.
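
The "mostly blue" rule of thumb follows directly from how normals are packed into the texture: each component is remapped from [-1, 1] to [0, 1], so the flat tangent-space normal (0, 0, 1) is stored as RGB (0.5, 0.5, 1.0), a light blue. A quick sketch of the encode/decode round trip (plain Python rather than GLSL, for illustration):

```python
def encode(n):
    # Pack a unit normal from [-1, 1] into the texture range [0, 1].
    return tuple(0.5 * c + 0.5 for c in n)

def decode(rgb):
    # Undo the packing, as the fragment shader does: normal * 2.0 - 1.0.
    return tuple(2.0 * c - 1.0 for c in rgb)

flat = (0.0, 0.0, 1.0)             # "straight up" in tangent space
print(encode(flat))                # (0.5, 0.5, 1.0) -> the typical blue
print(decode((0.5, 0.5, 1.0)))     # (0.0, 0.0, 1.0)
```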

To work with tangent-space normal maps, you need to do a bit more work. Typically, in the vertex shader you take a set of basis vectors (Normal, Binormal and Tangent, aka NBT) and use them either to transform the normal map into world space, or to transform the light position into tangent space (the latter is often more efficient for low numbers of lights, because you do more of the work in the vertex shader).
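
The reason the second route works: for an orthonormal T, B, N basis, the transpose of mat3(T, B, N) is its inverse, so multiplying world-space vectors by it moves them into tangent space without changing angles, and the diffuse dot product comes out the same in either space. A small numeric check (plain Python with made-up orthonormal vectors, purely illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def to_tangent(T, B, N, v):
    # transpose(mat3(T, B, N)) * v: the rows of the transposed matrix are T, B, N.
    return (dot(T, v), dot(B, v), dot(N, v))

# A hypothetical orthonormal tangent basis.
T = (1.0, 0.0, 0.0)
B = (0.0, 0.0, 1.0)
N = (0.0, 1.0, 0.0)

light_dir = (0.6, 0.8, 0.0)   # unit-length light direction in world space

# Diffuse term computed in world space...
world_diffuse = dot(light_dir, N)

# ...equals the same term in tangent space, where the geometric normal is (0, 0, 1).
tangent_diffuse = dot(to_tangent(T, B, N, light_dir), (0.0, 0.0, 1.0))

print(world_diffuse, tangent_diffuse)   # 0.8 0.8
```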

Tangent space is tricky to get your head around, and I'd recommend reading around the subject a bit. This article might be a decent start.

Columbo
  • I think my normal map is probably in tangent space. I followed this tutorial: http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/ . But strangely it does not work for me. I wonder if I am misunderstanding the camera space in this tutorial. What I did was directly apply the transpose of the TBN matrix to my camera position and eye position. – Louis Kuang Nov 15 '16 at 02:40
  • Hi! Can you take a look at my updated code? Thank you! – Louis Kuang Nov 17 '16 at 12:41