
I want to get the difference between two images, normalise the resulting array, and scale it to fit [0,255].

I was trying the following:

import cv2

img1 = cv2.imread("lko.png")
img2 = cv2.imread("jum.png")

cv2.subtract(img1, img2)

But I am not sure if this is the correct way to do this. How could I normalise the resulting array and scale it to fit [0,255]?

  • you could multiply every value by 255 divided by the maximum value of that "difference array" – Azrael Jan 20 '19 at 11:30

1 Answer


First, you need to define what you mean by the "difference between 2 images". It is important to notice that cv2.subtract performs saturated subtraction: if your images are e.g. np.uint8, all negative values are clipped to 0 (see more details in this answer). Maybe that's not what you want. But let's say it is. Then you can do what @Azrael said in the comments (you also need to subtract the minimum first, see below). If you do not understand why, more details can be seen in this answer. Basically, your code would look like this (written step-by-step so that you can follow each range):

import cv2
import numpy as np

img1 = cv2.imread("lko.png")
img2 = cv2.imread("jum.png")

diff = cv2.subtract(img1, img2)       # range [a, b]
diff = diff - np.min(diff)            # range [0, b-a]
diff = diff / np.max(diff)            # range [0, 1] (now float; in-place /= would fail on uint8)
diff = (diff * 255).astype(np.uint8)  # range [0, 255]
Berriel