I have a simple struct "mat4", consisting of a float m[4][4] member, and an operator*= to multiply 4x4 matrices. It takes a const mat4& "rhs", and its body is as follows:
this->m[0][0] = this->m[0][0] * rhs[0][0] + this->m[0][1] * rhs[1][0] + this->m[0][2] * rhs[2][0] + this->m[0][3] * rhs[3][0];
this->m[0][1] = this->m[0][0] * rhs[0][1] + this->m[0][1] * rhs[1][1] + this->m[0][2] * rhs[2][1] + this->m[0][3] * rhs[3][1];
this->m[0][2] = this->m[0][0] * rhs[0][2] + this->m[0][1] * rhs[1][2] + this->m[0][2] * rhs[2][2] + this->m[0][3] * rhs[3][2];
this->m[0][3] = this->m[0][0] * rhs[0][3] + this->m[0][1] * rhs[1][3] + this->m[0][2] * rhs[2][3] + this->m[0][3] * rhs[3][3];
this->m[1][0] = this->m[1][0] * rhs[0][0] + this->m[1][1] * rhs[1][0] + this->m[1][2] * rhs[2][0] + this->m[1][3] * rhs[3][0];
this->m[1][1] = this->m[1][0] * rhs[0][1] + this->m[1][1] * rhs[1][1] + this->m[1][2] * rhs[2][1] + this->m[1][3] * rhs[3][1];
this->m[1][2] = this->m[1][0] * rhs[0][2] + this->m[1][1] * rhs[1][2] + this->m[1][2] * rhs[2][2] + this->m[1][3] * rhs[3][2];
this->m[1][3] = this->m[1][0] * rhs[0][3] + this->m[1][1] * rhs[1][3] + this->m[1][2] * rhs[2][3] + this->m[1][3] * rhs[3][3];
this->m[2][0] = this->m[2][0] * rhs[0][0] + this->m[2][1] * rhs[1][0] + this->m[2][2] * rhs[2][0] + this->m[2][3] * rhs[3][0];
this->m[2][1] = this->m[2][0] * rhs[0][1] + this->m[2][1] * rhs[1][1] + this->m[2][2] * rhs[2][1] + this->m[2][3] * rhs[3][1];
this->m[2][2] = this->m[2][0] * rhs[0][2] + this->m[2][1] * rhs[1][2] + this->m[2][2] * rhs[2][2] + this->m[2][3] * rhs[3][2];
this->m[2][3] = this->m[2][0] * rhs[0][3] + this->m[2][1] * rhs[1][3] + this->m[2][2] * rhs[2][3] + this->m[2][3] * rhs[3][3];
this->m[3][0] = this->m[3][0] * rhs[0][0] + this->m[3][1] * rhs[1][0] + this->m[3][2] * rhs[2][0] + this->m[3][3] * rhs[3][0];
this->m[3][1] = this->m[3][0] * rhs[0][1] + this->m[3][1] * rhs[1][1] + this->m[3][2] * rhs[2][1] + this->m[3][3] * rhs[3][1];
this->m[3][2] = this->m[3][0] * rhs[0][2] + this->m[3][1] * rhs[1][2] + this->m[3][2] * rhs[2][2] + this->m[3][3] * rhs[3][2];
this->m[3][3] = this->m[3][0] * rhs[0][3] + this->m[3][1] * rhs[1][3] + this->m[3][2] * rhs[2][3] + this->m[3][3] * rhs[3][3];
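For context, here is roughly what the rest of the struct looks like. This is a trimmed-down sketch rather than the full class; the operator[] overload is just there so that rhs[0][0] works instead of rhs.m[0][0].

struct mat4 {
    float m[4][4];

    // operator[] so that rhs[row][col] reads rhs.m[row][col]
    float* operator[](int row) { return m[row]; }
    const float* operator[](int row) const { return m[row]; }

    mat4& operator*=(const mat4& rhs); // body shown above
};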
I just wanted to get confirmation of whether this is correct or not: when I multiply the two matrices (projection * view) in C++ and pass the resulting matrix to the shader, nothing shows up on the screen.
But if I pass the projection and view matrices to the shader separately and multiply them in GLSL, everything works great and the results are as expected.
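The two ways I feed this to the shader look roughly like the sketch below (simplified; the uniform locations and the transpose flag are stand-ins for my actual setup):

// Path 1: combine on the CPU and upload one matrix (this is the case where nothing shows up)
mat4 pv = projection;
pv *= view; // the operator*= shown above
glUniformMatrix4fv(u_projView, 1, GL_FALSE, &pv.m[0][0]);

// Path 2: upload both matrices and multiply in the vertex shader (this works)
glUniformMatrix4fv(u_proj, 1, GL_FALSE, &projection.m[0][0]);
glUniformMatrix4fv(u_view, 1, GL_FALSE, &view.m[0][0]);
// vertex shader: gl_Position = projection * view * vec4(position, 1.0);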
So there must be something wrong with the matrix multiplication function?