
I am attempting to calibrate the extrinsics of four cameras that I have mounted on a set-up. They are pointing 90 degrees apart. I have already calibrated the intrinsic parameters, and I am thinking of using an image of a calibration pattern to find the extrinsics. What I have done so far is: placed the calibration pattern so that it lies flat on the table, so that its roll and yaw angles are 0 and its pitch is 90 degrees (as it lies parallel with the camera). The cameras have yaw angles of 0, 90, 180, and 270 degrees (as they are 90 degrees apart), and the roll angle of the cameras is 0 (as they do not tilt). So what is left to calculate is the pitch angle of the cameras.

I can't quite wrap my head around how to calculate it, as I am not used to doing mappings between coordinate systems, so any help is welcome. I have already written the part of the program that calculates the rotation vector of the calibration pattern in the image using the cv::solvePnPRansac() function, so I have the rotation vector (which I believe I can turn into a rotation matrix using cv::Rodrigues()).
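
Here is roughly what that part looks like as a minimal sketch (the point lists, intrinsics, and the helper name are placeholders for what my corner detection and intrinsic calibration actually produce):

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Placeholder helper: given the pattern corners in pattern coordinates, their
// detected pixel locations, and the intrinsics from the earlier calibration,
// compute the pattern pose in the camera frame.
void patternPoseInCamera(const std::vector<cv::Point3f>& objectPoints,
                         const std::vector<cv::Point2f>& imagePoints,
                         const cv::Mat& cameraMatrix, const cv::Mat& distCoeffs,
                         cv::Mat& R, cv::Mat& t)
{
    cv::Mat rvec;
    cv::solvePnPRansac(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, t);

    // Turn the axis-angle rotation vector into a 3x3 rotation matrix.
    cv::Rodrigues(rvec, R);
    // R and t now map pattern coordinates into camera coordinates:
    // X_cam = R * X_pattern + t
}
```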

What would the next step be for me in my calculations?

NoShadowKick
  • You can use cv::calib3D and provide the intrinsics. You only have to know the pixel locations and the 3D (global) locations of your markers – Micka Jan 25 '16 at 13:41
  • You can use the solvePnP results too (both R and T), but those are the object transformations for one fixed camera. To compute camera extrinsics (fixed object, moving camera) you have to invert the transformation – Micka Jan 25 '16 at 13:43
  • I can't find a function called calib3D - but are you saying I should use the rvec that I get from calibration of the specific pattern I use for the extrinsics? – NoShadowKick Jan 25 '16 at 14:48
  • sorry, it is called `cv::calibrateCamera` http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#calibratecamera . No, you can't use the rvec (AND tvec!) from solvePnP for your extrinsics; you have to "invert" them first. Try to convince yourself that moving the camera and moving/rotating the object are inverse operations. – Micka Jan 25 '16 at 14:58
  • Okay, thanks for the help, that was what I meant, because *cv::calibrateCamera()* (or *cv::fisheye::calibrate()*, which I use) outputs a vector of rotation (and translation) vectors, so I will use the vector corresponding to the calibration image I am using for further calculations. – NoShadowKick Jan 25 '16 at 15:02
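
To make the inversion step from Micka's comments concrete, here is a hedged sketch: solvePnP gives the pattern pose in the camera frame, so the camera pose in the pattern (world) frame is the inverse of that transform. The Euler-angle extraction at the end assumes a Z-Y-X (yaw-pitch-roll) convention and is only meant to illustrate recovering a pitch angle; the actual convention has to match the rig.

```cpp
#include <opencv2/core.hpp>
#include <cmath>

// Invert the pattern->camera transform returned by solvePnP to get the camera
// pose in the pattern (world) frame:
//   X_cam = R * X_world + t   =>   X_world = R^T * X_cam - R^T * t
void cameraPoseInWorld(const cv::Mat& R, const cv::Mat& t,
                       cv::Mat& R_cam, cv::Mat& t_cam)
{
    R_cam = R.t();        // camera orientation in the world frame
    t_cam = -R.t() * t;   // camera position in the world frame
}

// One possible way to read a pitch angle out of R_cam, assuming a Z-Y-X
// (yaw-pitch-roll) Euler convention and a CV_64F matrix (the default type
// produced by solvePnP / cv::Rodrigues).
double pitchFromRotation(const cv::Mat& R_cam)
{
    return std::asin(-R_cam.at<double>(2, 0));  // pitch in radians
}
```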

0 Answers