
I'm trying to stitch two images together by warping one onto the other based on the homography between them.

I calculate my homography and get the matching points between the two images following the standard homography example. I then try to warp the images together using that homography with the code below:

// Use the homography matrix to warp the first image onto the output canvas
cv::Mat result;
warpPerspective(I_1, result, H_12, cv::Size(800, 600));

// Paste the unwarped second image into the top-left corner of the canvas
cv::Mat half(result, cv::Rect(0, 0, I_2.cols, I_2.rows));
I_2.copyTo(half);
return result;

I_1 and I_2 are the two images, and H_12 is the homography between them.

However, this results in an image like this:

[screenshot: incorrect result, the two images end up offset]

Using the standard Stitcher class, both images are stitched correctly:

[screenshot: correctly stitched result from cv::Stitcher]

So I'm wondering how I can correctly warp my images based on the homography to combine them.

My full code is here.

EDIT:

The way I had calculated the homography caused the images to be offset. I used BFMatcher instead of FlannBasedMatcher, and I got it working with the example linked by skeller.
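For reference, this is roughly what the matching side looks like now. It's only a sketch: the function name computeHomography and the choice of ORB as the detector are mine for illustration (the linked example may use a different detector), but the BFMatcher and findHomography calls are the relevant part:

  #include <opencv2/opencv.hpp>
  #include <vector>

  // Sketch of the matching pipeline: ORB keypoints, brute-force matching,
  // then findHomography with RANSAC. The returned H maps I_2 into I_1's frame.
  cv::Mat computeHomography(const cv::Mat& I_1, const cv::Mat& I_2)
  {
      cv::Ptr<cv::ORB> orb = cv::ORB::create();
      std::vector<cv::KeyPoint> kp1, kp2;
      cv::Mat desc1, desc2;
      orb->detectAndCompute(I_1, cv::noArray(), kp1, desc1);
      orb->detectAndCompute(I_2, cv::noArray(), kp2, desc2);

      // BFMatcher with Hamming distance (ORB descriptors are binary),
      // cross-check enabled so only mutual best matches are kept.
      cv::BFMatcher matcher(cv::NORM_HAMMING, true);
      std::vector<cv::DMatch> matches;
      matcher.match(desc2, desc1, matches);

      // Source points from I_2, destination points from I_1, so the
      // resulting homography warps I_2 into I_1's coordinate frame.
      std::vector<cv::Point2f> pts2, pts1;
      for (const cv::DMatch& m : matches)
      {
          pts2.push_back(kp2[m.queryIdx].pt);
          pts1.push_back(kp1[m.trainIdx].pt);
      }

      return cv::findHomography(pts2, pts1, cv::RANSAC);
  }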

I could also get it working with my original method. I needed to warp the second image instead of the first, since the homography maps points from I_2 into I_1's frame. So the code above changes to this:

  cv::Mat result;
  warpPerspective(I_2, result, H_12, cv::Size(800, 600));

  // Paste the unwarped first image into the top-left corner; the ROI has to
  // match I_1's size for copyTo to write into result rather than reallocate.
  cv::Mat half(result, cv::Rect(0, 0, I_1.cols, I_1.rows));
  I_1.copyTo(half);
  return result;
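
One more note: the hard-coded cv::Size(800,600) output only happens to be big enough for my images. A more general sketch (assuming the second image ends up to the right of the first) sizes the canvas from both inputs so the warped I_2 isn't cropped:

  // Assumption: the panorama extends to the right, so make room for both
  // images side by side; needs <algorithm> for std::max.
  cv::Size canvas(I_1.cols + I_2.cols, std::max(I_1.rows, I_2.rows));
  warpPerspective(I_2, result, H_12, canvas);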
  • does https://stackoverflow.com/questions/8205835/stitching-2-images-in-opencv?rq=1 help? – skeller Jul 10 '18 at 18:07
  • Thanks. Helps a bit. Only able to get the first image to show up with that method. Obviously I'm missing something. – Elliot Jul 10 '18 at 21:16
  • I know this is a month old, but I once wrote a long answer [here](https://stackoverflow.com/a/44459869/5087436) to warp two images into a new image without any boundary cutoffs, and I wrote some functions to do this in both Python and C++ which you can grab here: https://github.com/alkasm/padded-transformations-cpp – alkasm Aug 17 '18 at 17:13
