
I want to calculate the epipolar lines for the interest points between two images. I am working on a fountain dataset, so I have the rotation and translation matrices as well as the camera matrix. I am currently using Matlab to get results quickly, but the version I have is quite old (2009).

I am calculating the essential matrix as E = t*R and then the epipolar line as l = E*P, where P is the interest point (or a set of interest points). The result is a 3x1 vector, which I assume holds the parameters of the line ax + by + c = 0. However, the epipolar line drawn on the right image is totally wrong; it is nowhere near the correspondence of the point from the left image. Any idea?
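
In code, this is roughly what I am doing (a minimal sketch; the variable names are simplified, and I build a skew-symmetric matrix from the translation so that E comes out 3x3):

    % R: 3x3 rotation, t: 3x1 translation, taken from the dataset's camera files
    tx = [   0   -t(3)   t(2);
           t(3)     0   -t(1);
          -t(2)   t(1)     0 ];    % skew-symmetric matrix of t
    E = tx * R;                    % essential matrix as I currently compute it
    P = [433.36; 861.15; 1];       % interest point in homogeneous image coordinates
    l = E * P;                     % l = [a; b; c], i.e. the line a*x + b*y + c = 0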

Edit: Used dataset --> fountain benchmark, images 0000 and 0001 http://cvlabwww.epfl.ch/~strecha/multiview/denseMVS.html

Output: essential matrix, and the epipolar line for the point P1 = [433.36; 861.15; 1]:

E =

      0.761857065048902    1.969487475012598   40.418915885686594
     -0.927781947178923    0.698934833377211   33.173562943087106
    -45.044061511303227  -26.573128396975097    1.000000000000000

E has two complex-conjugate eigenvalues.

Epipolar line: 1.0e+004 *

    0.206660143270238   0.023299771007641  -4.240274401559348

user_3849
  • The issues you're having could result from any number of things: not understanding the concepts, using the wrong operations in Matlab, poor-quality images, etc. Providing a sample of what you're working on (code, images, output, etc.) would certainly help. If your question isn't resolved by the time I get off work, I will try to provide a worked-out solution. – Tyler Morrow Jul 29 '13 at 17:28
  • Thanks! You are right. Question edited. – user_3849 Jul 29 '13 at 18:16

2 Answers


Finally, I found the solution to my problem. I am posting it here in case somebody else is interested.

To calculate the relative rotation and translation correctly, the roto-translation matrix has to be used. This is a 4x4 matrix per image: the upper-left 3x3 block is the rotation (with respect to the world coordinate system), the fourth column holds the translation vector (also with respect to the world coordinate system), and the last row is [0 0 0 1]. Given two such matrices for the two images, the relative roto-translation matrix is Qright-->left = inv(Qright)*Qleft. From this matrix, extract the relative translation t (fourth column) and rotation R (upper-left 3x3 block), then build the skew-symmetric matrix T from t. The essential matrix is E = R*T.

But this isn't enough. In order to calculate the epipolar lines correctly, the fundamental matrix F has to be found. For a dataset such as the one I used, the camera matrices K are given, so this is easy: F = inv(Kright')*E*inv(Kleft), where (') is the transpose and inv is the matrix inverse. Then the epipolar lines of the right image are calculated as lines = F*P, where P is the point in homogeneous coordinates.
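
A minimal Matlab sketch of these steps (variable names are my own; Qleft and Qright are the 4x4 roto-translation matrices and Kleft, Kright the camera matrices read from the dataset's .camera files):

    % Relative pose: maps left-camera coordinates into the right camera's frame
    Q = inv(Qright) * Qleft;
    R = Q(1:3, 1:3);                      % relative rotation
    t = Q(1:3, 4);                        % relative translation

    % Skew-symmetric matrix of the relative translation
    T = [   0   -t(3)   t(2);
          t(3)     0   -t(1);
         -t(2)   t(1)     0 ];

    E = R * T;                            % essential matrix
    F = inv(Kright') * E * inv(Kleft);    % fundamental matrix

    % Epipolar line in the right image for a left-image point (homogeneous coordinates)
    P = [433.36; 861.15; 1];
    line = F * P;                         % [a; b; c] of a*x + b*y + c = 0

Plotting a*x + b*y + c = 0 on the right image should then pass through (or very close to) the match of P.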

Thank you!

user_3849

There are lots of documents online that explain epipolar geometry and how to find epipolar lines in stereo images. Here is one. It walks you through the different concepts reasonably well. The trick to this topic, I found, is keeping track of the variables, which are ultimately the result of matrix transformations and implied (professor-shortcut) algebraic operations.

My recommendation would be to look at page 12 of the link I've provided and apply it to your scenario. Without any data to go off of other than the description you've provided, it's impossible to work out the problem.

Good luck.

Note: sorry to hear your Matlab version is old. I know that 2013 has built-in functions for this stuff, but I'm not sure whether 2009 does, because MathWorks requires an account to read the older documentation.

Tyler Morrow
  • Thanks! Yeah, I have already checked millions of epipolar geometry manuals. The problem is that I am doing something wrong in the implementation. You can check the edit above. – user_3849 Jul 29 '13 at 18:18
  • I know that in MATLAB 2013 you can compute all this stuff with ready-made functions, but in any case I should try that way as well. – user_3849 Jul 29 '13 at 18:19
  • Ok, I am still looking at this problem. I guess it is a problem with the rotation and translation matrices I provided in E=t*R. The dataset I use includes some .camera files that give information about R and t wrt the world reference system. Now I have to find the relative translation and rotation between the cameras. I did it like the second answer here http://stackoverflow.com/questions/12283501/how-to-calculate-extrinsic-parameters-of-one-camera-relative-to-the-second-camer but I still don't have good results. Any recommendation? – user_3849 Aug 02 '13 at 12:27