I'm not sure if this is a bug or something like that, or if my code just misuses the lookAt function, but its behavior seems a bit odd to me. The part of my code in question is:
ubo.view = glm::lookAt(glm::vec3(0.0f, 0.0f, 2.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 0.0f, 1.0f));
Here is what I understand of it:
glm::vec3(0.0f, 0.0f, 2.0f)
means that my camera is above the center of the "world" by 2.0f, so its coordinates are x:0, z:0, y:2, if I'm not mistaken
glm::vec3(0.0f, 0.0f, 0.0f)
means that my camera is looking at the center of the world (x:0, z:0, y:0)
glm::vec3(0.0f, 0.0f, 1.0f)
and this means that "my world's up" is on the y axis, since that's the component set to 1, correct?
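For reference, here is the same call with the parameters labeled, wrapped in a small sketch (the makeView helper is just for illustration; as far as I understand, glm::lookAt takes eye, center, and up in that order):

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// glm::lookAt(eye, center, up) builds the view matrix from:
//   eye    - the camera position
//   center - the point the camera looks at
//   up     - the direction that should count as "up" for the camera
glm::mat4 makeView()
{
    return glm::lookAt(
        glm::vec3(0.0f, 0.0f, 2.0f),   // eye:    camera position
        glm::vec3(0.0f, 0.0f, 0.0f),   // center: looking at the origin
        glm::vec3(0.0f, 0.0f, 1.0f));  // up:     the "world up" I'm passing
}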
But when I run it this way, instead of seeing what I rendered, I see pitch black, while if I set "my world's up" to either the x or z axis, I can see the square I'm trying to render just fine. Normally I'd say, "Well, I guess they messed something up, or I don't understand this yet, or both, and I'll just move on", but then I had an idea and tried it like this:
ubo.view = glm::lookAt(glm::vec3(0.1f, 0.1f, 2.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 0.0f, 1.0f));
And it worked, well, sort of: now I'm looking at it from a slight angle, which I don't want. Because my camera is no longer at exactly x:0 and z:0, it does its job for some reason.
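To make the comparison concrete, here are the two calls side by side (just a sketch of what I described above):

ubo.view = glm::lookAt(glm::vec3(0.0f, 0.0f, 2.0f),   // original camera position
                       glm::vec3(0.0f, 0.0f, 0.0f),
                       glm::vec3(0.0f, 0.0f, 1.0f));   // -> renders pitch black for me

ubo.view = glm::lookAt(glm::vec3(0.1f, 0.1f, 2.0f),   // camera nudged slightly
                       glm::vec3(0.0f, 0.0f, 0.0f),
                       glm::vec3(0.0f, 0.0f, 1.0f));   // -> shows the square, but from a slight angle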
So, I just want to ask you guys, why does it work this way?
I can clearly make it work in a way that looks like what I want, but it still seems really weird to me. It took me some time to figure out why fragments are drawn clockwise, so maybe this is just something I've yet to learn.