
I have two versions of an image. One image is not rotated, while the other image is rotated. How can I measure the degree of rotation of the second image with respect to the first image in Python?

I looked around, but couldn't find a clear method for doing this. For instance, I checked this answer, but when I applied it to my non-rotated image, it returned an angle of around -70, while I expected 0. It also gave me the wrong angle for another rotated image I have. Apart from that, I would like to compare the rotated image with some reference image, which I believe the code doesn't cover.

I also checked this answer, but couldn't grasp the idea of how I can measure the rotation with respect to the original (reference) image.

Thanks for your kind support

Simplicity
  • Are the images `rotated` exactly on the center, or can this vary? – user1767754 Nov 28 '17 at 01:19
  • Post some sample images. – zindarod Nov 28 '17 at 01:30
  • @user1767754 Yes, I believe that most of my images are rotated on the center. – Simplicity Nov 28 '17 at 01:35
  • @zindarod Sure, I have added a sample image. – Simplicity Nov 28 '17 at 01:36
  • Add the non rotated one as well.... and make sure those are real data. One method would be using `SIFT` algorithm to track features and see how far they went apart. – user1767754 Nov 28 '17 at 02:15
  • 1
    Search for "image alignment in opencv" to find a lot of tutorials for this. [This](https://stackoverflow.com/questions/22086062/align-images-in-opencv) answer has a lot of suggestions for the various ways of accomplishing this. The two general methods are *dense* optical flow, which looks to align each pixel, and *sparse* optical flow, which looks to align features. Feature based methods are faster and more robust to different illumination conditions. If your images are *purely* rotated, then you can simply rotate your image in a loop and do template matching. – alkasm Nov 28 '17 at 02:21
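Following the brute-force idea in the last comment (if the images are purely rotated, rotate one in a loop and match it against the other), here is a minimal NumPy-only sketch. The function names, the nearest-neighbour rotation helper, and the 1° search step are my own choices for illustration, not from any of the linked answers; a feature-based approach (e.g. OpenCV's SIFT/ORB plus `estimateAffinePartial2D`) would be more robust on real photographs.

```python
import numpy as np

def rotate_nn(img, angle_deg):
    """Rotate a 2-D array about its centre (nearest-neighbour, same shape).

    Pixels mapped from outside the source image are set to zero.
    """
    theta = np.deg2rad(angle_deg)
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse mapping: for each output pixel, find the source pixel.
    xr = np.cos(theta) * (xs - cx) + np.sin(theta) * (ys - cy) + cx
    yr = -np.sin(theta) * (xs - cx) + np.cos(theta) * (ys - cy) + cy
    xi = np.rint(xr).astype(int)
    yi = np.rint(yr).astype(int)
    valid = (xi >= 0) & (xi < w) & (yi >= 0) & (yi < h)
    out = np.zeros_like(img)
    out[valid] = img[yi[valid], xi[valid]]
    return out

def estimate_rotation(reference, rotated, angles=np.arange(-180.0, 180.0, 1.0)):
    """Return the angle (degrees) by which `rotated` appears rotated
    relative to `reference`, found by exhaustive search over `angles`
    using normalised cross-correlation as the matching score."""
    best_angle, best_score = 0.0, -np.inf
    ref = reference - reference.mean()
    for a in angles:
        cand = rotate_nn(rotated, a)
        cand = cand - cand.mean()
        denom = np.linalg.norm(ref) * np.linalg.norm(cand)
        score = (ref * cand).sum() / denom if denom else -np.inf
        if score > best_score:
            best_angle, best_score = a, score
    # `best_angle` undoes the rotation, so the applied rotation is its negative.
    return -best_angle

if __name__ == "__main__":
    # Synthetic test: an off-centre Gaussian blob, rotated by a known angle.
    ys, xs = np.mgrid[0:101, 0:101]
    ref = np.exp(-((xs - 70) ** 2 + (ys - 50) ** 2) / 200.0)
    rotated = rotate_nn(ref, 30.0)
    print(estimate_rotation(ref, rotated))   # close to 30.0
    print(estimate_rotation(ref, ref))       # close to 0.0
```

The exhaustive search is O(angles × pixels), so it is only practical for small images or coarse angular steps; it also assumes the rotation is about the image centre, matching the situation described in the comments.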
