I am trying to configure a camera in pyBullet based on intrinsic and extrinsic parameters obtained from calibrating a real camera.
What I have
The camera is calibrated with OpenCV, giving me a camera matrix
| f_x   0   c_x |
|  0   f_y  c_y |
|  0    0    1  |
and a vector of distortion coefficients
(k_1, k_2, p_1, p_2, k_3)
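To make clear how I use these intrinsics: they describe a standard pinhole projection. A minimal sketch (the numeric values below are made-up placeholders, not my actual calibration results):

```python
def project_point(X, Y, Z, f_x, f_y, c_x, c_y):
    """Project a 3D point (camera frame, Z > 0) to pixel coordinates
    using the pinhole model given by the camera matrix above."""
    u = f_x * X / Z + c_x
    v = f_y * Y / Z + c_y
    return u, v

# Example with placeholder intrinsics
u, v = project_point(0.1, -0.05, 1.0, f_x=600.0, f_y=598.0, c_x=321.5, c_y=238.2)
```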
(I also have the pose of the camera, but that is not relevant to the actual question, so I leave it out here.)
What I already did
Unfortunately, pyBullet's computeProjectionMatrix
function is a bit limited: it assumes f_x = f_y
and that c_x, c_y
lie exactly at the center of the image, neither of which holds for my camera. Therefore I compute the projection matrix myself as follows (based on this):
projection_matrix = [
    [2 * f_x / w,  0,            (w - 2 * c_x) / w,  0],
    [0,            2 * f_y / h,  (2 * c_y - h) / h,  0],
    [0,            0,            A,                  B],
    [0,            0,            -1,                 0],
]
where w, h are the width and height of the image, A = (near + far)/(near - far) and B = 2 * near * far / (near - far), with near and far defining the range on the z-axis that is included in the image (see pybullet.computeProjectionMatrix).
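Putting the above together, this is roughly how I build the matrix in code. A sketch with placeholder intrinsics and image size standing in for my real calibration values; the flattening assumes pyBullet wants the 16 floats in column-major (OpenGL) order, as consumed by getCameraImage:

```python
def build_projection_matrix(f_x, f_y, c_x, c_y, w, h, near, far):
    """Build an OpenGL-style projection matrix from camera intrinsics."""
    A = (near + far) / (near - far)
    B = 2 * near * far / (near - far)
    projection_matrix = [
        [2 * f_x / w,  0,            (w - 2 * c_x) / w,  0],
        [0,            2 * f_y / h,  (2 * c_y - h) / h,  0],
        [0,            0,            A,                  B],
        [0,            0,            -1,                 0],
    ]
    # Flatten in column-major order for pyBullet/OpenGL.
    return [projection_matrix[row][col] for col in range(4) for row in range(4)]

# Example with placeholder calibration values
proj = build_projection_matrix(f_x=600.0, f_y=598.0, c_x=321.5, c_y=238.2,
                               w=640, h=480, near=0.1, far=10.0)
```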
What is still missing (my actual question)
The above already gives me better results, but the rendered images still don't match the real images exactly. I suspect one reason for this might be that the lens distortion is not taken into account.
So finally coming to my question:
How can I implement distortion for the simulated camera using the parameters I got from calibrating the real camera?
Is there a way I can integrate this in the projection matrix? If not, is there another way?
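For reference, my understanding of what the coefficients describe: OpenCV's radial/tangential (Brown-Conrady) model applies to normalized image coordinates (x, y) = (X/Z, Y/Z) before the intrinsics are applied. A sketch of the forward model (the coefficient values in the example are made up):

```python
def distort_point(x, y, k1, k2, p1, p2, k3):
    """Apply OpenCV's radial/tangential distortion model to a
    normalized image point (x, y) = (X/Z, Y/Z)."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# With all coefficients zero, the point is unchanged
x_d, y_d = distort_point(0.1, 0.2, k1=0.0, k2=0.0, p1=0.0, p2=0.0, k3=0.0)
```

What I can't see is how to express this per-pixel, nonlinear mapping through the single linear projection matrix that pyBullet's renderer takes, which is why I'm asking whether there is another mechanism for it.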