I'm trying to stretch the contrast of an opencv3 / numpy.ndarray picture in my Python 3 application.
I found the normalize method, but it seems to do the opposite, e.g. compressing the range 1 to 255 into the range 10 to 11.
In NumPy I found nothing easy, but in OpenCV I came across the convertScaleAbs
function.
I passed it
alpha = 255 / (upper_boundary - lower_boundary)
as the scale and
beta = -(lower_boundary * (255 / (upper_boundary - lower_boundary)))
as the delta, but that produces strange results where dark parts become extremely light, even when I enter boundaries like 10 and 250.
I'm not sure whether my maths is correct or not.
I also saw the formula newValue = 255 * (oldValue - minValue) / (maxValue - minValue)
in another question, but I can't map that onto the scale and delta parameters of the convertScaleAbs
function, can I?
Edit:
I also tried the OpenCV normalize
function with alpha=lower_boundary
, beta=upper_boundary
and norm_type=cv2.NORM_MINMAX
. I can enter numbers between 0 and 255, but the resulting image is confined to exactly those brightness values, so the contrast is reduced, not stretched!
Here is a minimal complete verifiable example:
import cv2

image = cv2.imread("Tux.png")
# normalize writes in place here: darkest pixel -> 20, brightest -> 200
cv2.normalize(image, image, alpha=20, beta=200, norm_type=cv2.NORM_MINMAX)
cv2.namedWindow("TestWindow", cv2.WINDOW_AUTOSIZE)
cv2.imshow("TestWindow", image)
cv2.waitKey(0)
I'd be thankful for any hints about an easy function I may have overlooked, or about what I could use to do this more easily by hand!
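For comparison, here is a manual NumPy sketch of the formula from the other question (the clipping step and the function name stretch are my own assumptions about what I actually want to happen to out-of-range pixels):

```python
import numpy as np

def stretch(img, lower, upper):
    # clip to the chosen input range, then map [lower, upper] -> [0, 255]
    clipped = np.clip(img.astype(np.float32), lower, upper)
    return (255 * (clipped - lower) / (upper - lower)).astype(np.uint8)

img = np.array([[5, 10, 130, 250, 255]], dtype=np.uint8)
result = stretch(img, 10, 250)
print(result)
```

Values at or below the lower boundary come out as 0 and values at or above the upper boundary as 255, which is the behavior I would expect from a contrast stretch.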