I have been having trouble understanding the rotation vectors returned by OpenCV's Calib3d.calibrateCamera() function. To understand the return values, I tried the following steps.
First, I generated a chessboard pattern in Java. I then applied 15 rotation vectors and 15 translation vectors to it, together with a known camera intrinsic matrix. I was basically trying to recreate the manual process of taking many photographs of the chessboard. Some of the images I created look like this:
[Image: one of the generated chessboard images with the identified corners]
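For reference, the projection step above can be sketched in plain Java (this is my reconstruction of the idea, not the asker's actual code; the class and method names, board size, and intrinsic values are all hypothetical). Each 3D board corner is rotated and translated into the camera frame and then projected through the pinhole model:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch of generating synthetic chessboard views:
 * take the 3D board corners (on the Z=0 plane), apply a known
 * rotation vector and translation vector, then project with a
 * known intrinsic matrix K = [[fx,0,cx],[0,fy,cy],[0,0,1]].
 */
public class ChessboardProjector {

    /** Rodrigues' formula: rotation vector -> 3x3 rotation matrix. */
    static double[][] rodrigues(double[] r) {
        double theta = Math.sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]);
        if (theta < 1e-12) {
            return new double[][]{{1,0,0},{0,1,0},{0,0,1}};
        }
        double x = r[0]/theta, y = r[1]/theta, z = r[2]/theta;
        double c = Math.cos(theta), s = Math.sin(theta), v = 1 - c;
        return new double[][]{
            {c + x*x*v,   x*y*v - z*s, x*z*v + y*s},
            {y*x*v + z*s, c + y*y*v,   y*z*v - x*s},
            {z*x*v - y*s, z*y*v + x*s, c + z*z*v}
        };
    }

    /** Project one 3D point: camera point = R*X + t, then pinhole divide. */
    static double[] project(double[] X, double[] rvec, double[] t,
                            double fx, double fy, double cx, double cy) {
        double[][] R = rodrigues(rvec);
        double xc = R[0][0]*X[0] + R[0][1]*X[1] + R[0][2]*X[2] + t[0];
        double yc = R[1][0]*X[0] + R[1][1]*X[1] + R[1][2]*X[2] + t[1];
        double zc = R[2][0]*X[0] + R[2][1]*X[1] + R[2][2]*X[2] + t[2];
        return new double[]{fx*xc/zc + cx, fy*yc/zc + cy};
    }

    public static void main(String[] args) {
        // Example: 9x6 inner-corner board, 25 mm squares, Z = 0 plane.
        List<double[]> corners = new ArrayList<>();
        for (int i = 0; i < 6; i++)
            for (int j = 0; j < 9; j++)
                corners.add(new double[]{j * 25.0, i * 25.0, 0.0});

        double[] rvec = {0.1, -0.2, 0.05};      // example pose (radians)
        double[] tvec = {-100.0, -60.0, 500.0}; // example translation (mm)
        for (double[] X : corners) {
            double[] uv = project(X, rvec, tvec, 800, 800, 320, 240);
            // ...draw a corner at pixel (uv[0], uv[1]) in the synthetic image
        }
        System.out.println("projected " + corners.size() + " corners");
    }
}
```

The key convention question is the direction of the transform in `project`: here `R*X + t` maps object (world) coordinates into camera coordinates, which is what OpenCV's calibration output describes.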
After generating the images, I ran them through the corner-identification and camera-calibration functions. The calibration function returned the intrinsic matrix I used almost exactly.
However, when I compare the rotation vectors I used to generate the images with the vectors I got back from the calibration function, they are negatives of each other. A comparison plot is shown below:
[Plot: comparing the input and output rotation values about the x axis]
Is this normal behavior, or am I doing something wrong?
An answer I came across says that OpenCV rotations are from the image coordinate system to the object coordinate system. Could that be the reason?
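If the two conventions really do run in opposite directions, a negated rotation vector is exactly what you would expect: by Rodrigues' formula, negating the rotation vector produces the inverse (transposed) rotation matrix. A small standalone check of that identity in plain Java (no OpenCV needed; class and variable names are mine):

```java
/**
 * Demonstrates that R(-r) = R(r)^T, i.e. negating a rotation
 * vector yields the inverse rotation. So if two conventions
 * define the pose in opposite directions (object->camera vs.
 * camera->object), their rotation vectors come out negated.
 */
public class RotationSignCheck {

    /** Rodrigues' formula: rotation vector -> 3x3 rotation matrix. */
    static double[][] rodrigues(double[] r) {
        double theta = Math.sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]);
        if (theta < 1e-12) {
            return new double[][]{{1,0,0},{0,1,0},{0,0,1}};
        }
        double x = r[0]/theta, y = r[1]/theta, z = r[2]/theta;
        double c = Math.cos(theta), s = Math.sin(theta), v = 1 - c;
        return new double[][]{
            {c + x*x*v,   x*y*v - z*s, x*z*v + y*s},
            {y*x*v + z*s, c + y*y*v,   y*z*v - x*s},
            {z*x*v - y*s, z*y*v + x*s, c + z*z*v}
        };
    }

    /** Max absolute difference between R(-r) and R(r) transposed. */
    static double negationVsTranspose(double[] r) {
        double[][] R = rodrigues(r);
        double[][] Rneg = rodrigues(new double[]{-r[0], -r[1], -r[2]});
        double maxDiff = 0.0;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                maxDiff = Math.max(maxDiff, Math.abs(Rneg[i][j] - R[j][i]));
        return maxDiff;
    }

    public static void main(String[] args) {
        double[] r = {0.3, -0.5, 0.2}; // an arbitrary rotation vector
        System.out.println("max |R(-r) - R(r)^T| = " + negationVsTranspose(r));
    }
}
```

Note that for a full pose the inverse also remaps the translation (t' = -R^T t), so if this is the cause, the translation vectors should differ by more than a simple sign flip.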