
I have been having trouble understanding the rotation vectors returned by the Calib3d.calibrateCamera() function of OpenCV. To understand the return values, I tried the following steps.

First I generated a chessboard pattern in Java. Then I applied 15 rotation vectors and 15 translation vectors to it, along with a known matrix representing the camera's intrinsics. I was basically trying to recreate the manual process of taking many photographs of the chessboard. Some of the images I created look like this: [one of the images I generated, with identified corners]
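The synthesis step described above boils down to the pinhole projection pixel = K(RX + t). A minimal plain-Java sketch of that projection is below (no OpenCV; the intrinsics K, rotation R, translation t, and corner X are made-up illustrative values, not the ones from my experiment — in OpenCV itself this is what Calib3d.projectPoints does):

```java
// Forward projection used to synthesize a chessboard view:
// p = R*X + t (world -> camera), then q = K*p, then divide by q[2].
// All numeric values in main() are illustrative assumptions.
public class Project {
    public static double[] project(double[][] K, double[][] R, double[] t, double[] X) {
        double[] p = new double[3];
        for (int i = 0; i < 3; i++) {
            p[i] = t[i];
            for (int j = 0; j < 3; j++) p[i] += R[i][j] * X[j];
        }
        double[] q = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) q[i] += K[i][j] * p[j];
        return new double[] { q[0] / q[2], q[1] / q[2] }; // (u, v) in pixels
    }

    public static void main(String[] args) {
        double[][] K = {{800, 0, 320}, {0, 800, 240}, {0, 0, 1}}; // assumed intrinsics
        double[][] R = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};         // identity rotation
        double[] t = {0, 0, 5};                                   // board 5 units ahead
        double[] uv = project(K, R, t, new double[] {0.1, 0.0, 0.0});
        System.out.printf("u=%.1f v=%.1f%n", uv[0], uv[1]);       // u=336.0 v=240.0
    }
}
```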

After generating the images, I ran them through the corner-identification and camera-calibration functions. The calibration function returned the intrinsic matrix I used almost exactly.

When I compare the rotation vectors I used to generate the images with the vectors I got back from the calibration function, they are negatives of each other. A comparison plot is shown below:

[Plot comparing input and output rotation values about the x axis]

Is this normal behavior, or am I doing something wrong?

An answer I found elsewhere says that OpenCV rotations are from the image coordinate system to the object coordinate system. Could that be the reason?


1 Answer


If the expected and actual rotation vectors are negatives of each other, then the expected and actual rotation matrices are inverses of each other. Hence you probably have the source and destination coordinate systems mixed up.
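To see why a negated rotation vector means an inverted rotation matrix: running r and -r through the Rodrigues formula (which is what Calib3d.Rodrigues computes) gives matrices that are transposes of each other, and for rotation matrices the transpose is the inverse. A plain-Java sketch with an arbitrary example vector:

```java
// Rodrigues formula: R = cos(t)*I + (1-cos(t))*r*r^T + sin(t)*[r]_x,
// where t = |rvec| and r = rvec/t. Negating rvec flips only the sin term,
// which is exactly what transposing R does, so R(-rvec) = R(rvec)^T.
public class RvecSign {
    public static double[][] rodrigues(double[] r) {
        double theta = Math.sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]);
        double[][] R = new double[3][3];
        if (theta < 1e-12) { // zero rotation: identity
            for (int i = 0; i < 3; i++) R[i][i] = 1.0;
            return R;
        }
        double x = r[0]/theta, y = r[1]/theta, z = r[2]/theta;
        double c = Math.cos(theta), s = Math.sin(theta), t = 1.0 - c;
        R[0][0] = c + t*x*x;   R[0][1] = t*x*y - s*z; R[0][2] = t*x*z + s*y;
        R[1][0] = t*y*x + s*z; R[1][1] = c + t*y*y;   R[1][2] = t*y*z - s*x;
        R[2][0] = t*z*x - s*y; R[2][1] = t*z*y + s*x; R[2][2] = c + t*z*z;
        return R;
    }

    public static void main(String[] args) {
        double[] r = {0.3, -0.2, 0.5}; // arbitrary example rotation vector
        double[][] A = rodrigues(r);
        double[][] B = rodrigues(new double[] {-r[0], -r[1], -r[2]});
        double maxDiff = 0.0;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                maxDiff = Math.max(maxDiff, Math.abs(B[i][j] - A[j][i]));
        System.out.println("max |R(-r) - R(r)^T| = " + maxDiff); // ~0
    }
}
```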

Normally, this is quite easy to check:

  1. Apply the [R|t] matrix to the 3D points corresponding to the pattern corners
  2. Project them into the image using the intrinsic matrix
  3. Check that the corner projections are consistent with what you observe in the image. If they are not, try the same thing with the inverse rotation matrix.
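The three steps can be sketched as follows in plain Java (K, R, t, the corner X, and the "observed" pixel are all invented illustrative values; with OpenCV you would reproject using Calib3d.projectPoints instead of the hand-rolled helper):

```java
// Consistency check: reproject a known corner with [R|t] and with [R^T|t],
// and see which lands on the observed pixel. All numbers are illustrative.
public class RotationCheck {
    static double[] project(double[][] K, double[][] R, double[] t, double[] X) {
        double[] p = new double[3];
        for (int i = 0; i < 3; i++) {
            p[i] = t[i];
            for (int j = 0; j < 3; j++) p[i] += R[i][j] * X[j];
        }
        double[] q = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) q[i] += K[i][j] * p[j];
        return new double[] { q[0] / q[2], q[1] / q[2] };
    }

    static double[][] transpose(double[][] R) {
        double[][] T = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++) T[i][j] = R[j][i];
        return T;
    }

    static double pixelError(double[] a, double[] b) {
        return Math.hypot(a[0] - b[0], a[1] - b[1]);
    }

    public static void main(String[] args) {
        double[][] K = {{800, 0, 320}, {0, 800, 240}, {0, 0, 1}}; // assumed intrinsics
        double[][] R = {{0, -1, 0}, {1, 0, 0}, {0, 0, 1}};        // 90 deg about z
        double[] t = {0, 0, 5};
        double[] X = {0.1, 0.0, 0.0};
        double[] observed = project(K, R, t, X); // stand-in for a detected corner
        // The correct convention reprojects onto the observation; the wrong one does not.
        System.out.println("error with R:   " + pixelError(project(K, R, t, X), observed));
        System.out.println("error with R^T: " + pixelError(project(K, transpose(R), t, X), observed));
    }
}
```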
BConic
  • 1) I have checked the order in which I am giving the inputs to the OpenCV calibrate function: object points first and image points next, as mentioned in the manual, so the order of inputs should not be a problem: `Calib3d.calibrateCamera(objectPoints, imagePoints, imageSize, cameraMatrix, distortionCoefficients, rvecs, tvecs, mFlags);` 2) If I use the returned rotation vectors, translations, and intrinsics to reproject the corners using Calib3d.projectPoints, they match very well. Any pointers? – stadatum Jun 01 '16 at 01:37