So, I've come across an oddity between GLSL and GLM.
If I generate the following view matrix (C++):
vec3 pos(4, 1, 1);
vec3 dir(1, 0, 0);
mat4 viewMat = glm::lookAt(pos, pos+dir, vec3(0,0,1));
And then, in GLSL, do:
fragColour.rgb = vec3(inverse(viewMat) * vec4(0,0,0,1)) / 4.f;
Then I expect the screen to become pinkish-red, (1.0, 0.25, 0.25) (the camera position divided by 4). Instead, I get black.
If I do this in GLM, however:
vec3 colour = vec3(glm::inverse(viewMat) * vec4(0,0,0,1)) / 4.f;
cout << glm::to_string(colour) << endl;
I get the expected result of (1.0, 0.25, 0.25).
Now, if I change the viewMat to instead be (C++):
vec3 pos(4, 1, 1);
vec3 dir(1, 0.000001, 0);
mat4 viewMat = glm::lookAt(pos, pos+dir, vec3(0,0,1));
Then bam! I get (1.0,0.25,0.25) in both GLSL and GLM.
This makes no sense to me. Why does this happen? The view matrix works fine everywhere else in GLSL; I just can't invert it. It breaks whenever dir.y == 0.f.
Also, please suggest improvements to the question title; I'm not sure what it should be.
Edit: Also, it doesn't seem to have anything to do with lookAt's up vector (which I set to Z anyway). Even if I set up to (0,1,0), the same thing happens. Everything turns sideways, but I still can't invert the view matrix in GLSL.
Edit: OK, so at derhass' suggestion, I tried sending the view matrix in already inverted. Bam, works perfectly. So it seems my GL implementation really is somehow incapable of inverting that matrix. This is easily the weirdest GL bug I've ever come across. Some explanation of why it's a bad idea to invert matrices in shaders would still be appreciated, though.

Edit again: Sending in inverted matrices throughout my engine resulted in a huge framerate boost. DEFINITELY DO THAT.