I use the following code to convert a 3×3 rotation matrix to Euler angles
(_r is a double[9]):
double angleZ = atan2(_r[3], _r[4]) * (180.0 / CV_PI);
double angleX = 180 - asin(-_r[5]) * (180.0 / CV_PI);
double angleY = 180 - atan2(_r[2], _r[8]) * (180.0 / CV_PI);
Here is a little helper showing how the matrix is laid out:
_r[0] _r[1] _r[2]
_r[3] _r[4] _r[5]
_r[6] _r[7] _r[8]
Does this make any sense? The angles seem oddly interdependent: X, Y and Z all react to a single pose change.
The rotation matrix is received from OpenCV's cvPOSIT function, so the points of interest might be wrong and causing this confusing effect...
but somehow I think I'm just doing the conversion wrong :)
I am applying the angles to a cube in OpenGL:
glRotatef(angleX,1.0f,0.0f,0.0f);
glRotatef(angleY,0.0f,1.0f,0.0f);
glRotatef(angleZ,0.0f,0.0f,1.0f);