
I'm trying to apply per-pixel lighting in my 3D engine, but I'm having trouble understanding what could be wrong with my geometry. I'm a beginner with OpenGL, so please bear with me if my question sounds stupid; I'll explain as best as I can.

My vertex shader:

```glsl
#version 400 core

layout(location = 0) in vec3 position;
in vec2 textureCoordinates;
in vec3 normal;

out vec2 passTextureCoordinates;
out vec3 normalVectorFromVertex;
out vec3 vectorFromVertexToLightSource;
out vec3 vectorFromVertexToCamera;

uniform mat4 transformation;
uniform mat4 projection;
uniform mat4 view;
uniform vec3 lightPosition;

void main(void) {
    vec4 mainPosition = transformation * vec4(position, 1.0);

    gl_Position = projection * view * mainPosition;
    passTextureCoordinates = textureCoordinates;

    normalVectorFromVertex = (transformation * vec4(normal, 1.0)).xyz;
    vectorFromVertexToLightSource = lightPosition - mainPosition.xyz;
}
```

My fragment-shader:

```glsl
#version 400 core

in vec2 passTextureCoordinates;
in vec3 normalVectorFromVertex;
in vec3 vectorFromVertexToLightSource;

layout(location = 0) out vec4 out_Color;

uniform sampler2D textureSampler;
uniform vec3 lightColor;

void main(void) {
    vec3 versor1 = normalize(normalVectorFromVertex);
    vec3 versor2 = normalize(vectorFromVertexToLightSource);

    float dotProduct = dot(versor1, versor2);
    float lighting = max(dotProduct, 0.0);
    vec3 finalLight = lighting * lightColor;

    out_Color = vec4(finalLight, 1.0) * texture(textureSampler, passTextureCoordinates);
}
```

The problem: whenever I multiply the normal vector by my transformation matrix with a homogeneous coordinate of 0.0, like so: transformation * vec4(normal, 0.0), the resulting vector gets messed up in such a way that by the time the pipeline reaches the fragment shader, the dot product between the normal and the vector from the vertex to the light source is probably <= 0, indicating that the light source is at an angle >= π/2, and therefore all my pixels output rgb(0, 0, 0, 1).

But, for a reason I cannot understand geometrically, if I calculate transformation * vec4(normal, 1.0) the lighting appears to work more or less fine, except for extremely weird behaviour, like 'reacting' to distance. With this very simple lighting technique the vertex brightness should be completely agnostic to distance, since accounting for distance would require using the vectors' lengths, and I normalize both vectors before applying the dot product, so this is completely unexpected to me.

One thing that is clearly wrong to me is that my transformation matrix has its translation component applied when multiplying the normal vectors, which will "move and point" the normals in the direction of the translation; that is wrong. Still, I'm not sure whether I should be getting these results. Any insights are appreciated.
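To make the w-component effect concrete, here is a small self-contained Python sketch (names are mine, plain lists instead of GLSL types, column-vector convention as in GLSL): multiplying a direction with w = 1.0 adds the matrix's translation column to it, so the "normal" ends up depending on where the object is.

```python
# Multiply a 4x4 matrix by a 4-component vector (column-vector convention).
def mat_vec4(m, v):
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# A model ("transformation") matrix that only translates by (10, 0, 0).
model = [
    [1.0, 0.0, 0.0, 10.0],
    [0.0, 1.0, 0.0,  0.0],
    [0.0, 0.0, 1.0,  0.0],
    [0.0, 0.0, 0.0,  1.0],
]

normal = [0.0, 1.0, 0.0]  # a surface normal pointing straight up

as_point     = mat_vec4(model, normal + [1.0])  # w = 1.0: translation leaks in
as_direction = mat_vec4(model, normal + [0.0])  # w = 0.0: direction only

print(as_point[:3])      # [10.0, 1.0, 0.0], depends on the object's position
print(as_direction[:3])  # [0.0, 1.0, 0.0], unchanged, as a direction should be
```

After normalization, the w = 1.0 result is dominated by the translation, so the shaded brightness changes as the object moves, which matches the "reacting to distance" behaviour described above.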

Bruno Giannotti
  • Check this out for transforming normals: https://stackoverflow.com/questions/13654401/why-transforming-normals-with-the-transpose-of-the-inverse-of-the-modelview-matr – Varaquilex Jul 04 '20 at 06:52
  • Are your normal vectors inverted? Do they point in the object rather than out of the object? Try `transformation * vec4(-normal, 0.0)` – Rabbid76 Jul 04 '20 at 07:10
  • It's so very strange. It doesn't make any difference, unfortunately. It is sad that people were so quick to close my question as a duplicate when it is not a duplicate at all. – Bruno Giannotti Jul 04 '20 at 17:48
  • @BrunoGiannotti The shader code doesn't seem to be causing the problem. There is a bug somewhere else in your code. – Rabbid76 Jul 04 '20 at 19:11
  • 1
    Right this is what I was thinking. If I could validate that the shader logic is in fact correct I can start hunting this bug. Thank you – Bruno Giannotti Jul 04 '20 at 19:13
  • 1
    Just an update. After your reply I was quickly able to identify my mistake, which was regarding the VAO attribute pointer attribution in the rendering process. It's funny that I was so convinced that it was something about my geometry in the shader code. Thank you so much for the help. – Bruno Giannotti Jul 04 '20 at 19:46

1 Answer


Whenever I multiply the normal vector by my transformation matrix with a homogeneous coordinate of 0.0, like so: transformation * vec4(normal, 0.0), the resulting vector gets messed up

What if you have non-uniform scaling in that transformation matrix?

Imagine a flat square surface, all normals are pointing up. Now you scale that surface to stretch in the horizontal direction: what would happen to normals?

If you don't adjust your transformation matrix to remove the scaling part, the normals will get skewed. After all, when transforming normals you only care about the object's orientation; the object's scale is irrelevant to where the surface is pointing.
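A quick numeric check of that claim (a hypothetical 2D example in Python, not from the original answer): take a 45° slope, whose tangent is (1, 1) and normal (-1, 1), stretch x by 2, and the naively transformed normal is no longer perpendicular to the transformed tangent.

```python
# Tangent and normal of a 45-degree slope; they are perpendicular (dot = 0).
tangent = (1.0, 1.0)
normal  = (-1.0, 1.0)

sx, sy = 2.0, 1.0  # non-uniform scale: stretch x, leave y alone

scaled_tangent = (sx * tangent[0], sy * tangent[1])  # (2.0, 1.0)
scaled_normal  = (sx * normal[0],  sy * normal[1])   # (-2.0, 1.0), naive transform

dot = scaled_tangent[0] * scaled_normal[0] + scaled_tangent[1] * scaled_normal[1]
print(dot)  # -3.0: no longer perpendicular, the normal got skewed
```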

Or think about a circle:

[Image: a circle non-uniformly scaled into an ellipse; the naively transformed normals are no longer perpendicular to the surface. Image and source link not preserved.]

You need to apply the inverse transpose of the model-view matrix to avoid scaling the normals when transforming them. Another SO question discusses it, as does this video from Jaime King teaching Graphics with OpenGL.
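As a sketch of that fix (plain stdlib Python, cofactor-based 3x3 inverse transpose; the helper names are mine): with a rotation plus a non-uniform scale, the plain model matrix skews the normal, while its inverse transpose keeps the normal perpendicular to the transformed surface.

```python
import math

def mat_mul3(a, b):
    # 3x3 matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec3(m, v):
    # 3x3 matrix times column vector.
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def inverse_transpose3(m):
    # Inverse transpose of a 3x3 matrix = cofactor matrix / determinant.
    cof = [[m[(i + 1) % 3][(j + 1) % 3] * m[(i + 2) % 3][(j + 2) % 3]
            - m[(i + 1) % 3][(j + 2) % 3] * m[(i + 2) % 3][(j + 1) % 3]
            for j in range(3)] for i in range(3)]
    det = sum(m[0][j] * cof[0][j] for j in range(3))
    return [[cof[i][j] / det for j in range(3)] for i in range(3)]

# Model matrix: rotate 30 degrees about z, then scale x by 2 (non-uniform).
c, s = math.cos(math.radians(30.0)), math.sin(math.radians(30.0))
rotation = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
scaling  = [[2.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
model    = mat_mul3(rotation, scaling)

tangent = [1.0, 1.0, 0.0]   # a direction lying in the surface
normal  = [-1.0, 1.0, 0.0]  # perpendicular to the tangent

moved_tangent = mat_vec3(model, tangent)
naive_normal  = mat_vec3(model, normal)                      # skewed
fixed_normal  = mat_vec3(inverse_transpose3(model), normal)  # still correct

dot = lambda a, b: sum(x * y for x, y in zip(a, b))
print(round(dot(moved_tangent, naive_normal), 6))    # -3.0: skewed
print(abs(dot(moved_tangent, fixed_normal)) < 1e-9)  # True: perpendicular
```

In the shaders above, the equivalent change would be to build a mat3 normal matrix from the inverse transpose of the upper-left 3x3 of `transformation` (GLSL's `inverse()` and `transpose()` built-ins are available in `#version 400`) and multiply `normal` by that instead.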


Varaquilex